I want to connect two SABRE Boards (https://www.nxp.com/support/developer-resources/hardware-development-tools/sabre-development-system/sabre-board-for-smart-devices-based-on-the-i.mx-6quadplus-applications-processors:RD-IMX6QP-SABRE) via PCIe. After reading this document (i.MX6Q PCIe EP/RC Validation System: https://community.nxp.com/docs/DOC-95014), I have some ideas about how to proceed, but a few questions remain:
- Where can I find use cases or example applications for connecting two SABRE boards over PCIe? Basically, I want to copy a large block of data from one processor into a pre-allocated region of the other processor's memory.
- Moreover, I want to use the high-performance SDMA controller on the i.MX6 to move the data to and from memory. Where can I find detailed documentation on setting up the SDMA for PCIe transfers on the i.MX6?
- As I understand it, to enable the PCIe Root Complex on one board and the End Point on the other, I should build Linux with Yocto (OpenEmbedded). Could you let me know which Linux kernel and U-Boot versions I should use on the i.MX6 to have both a PCIe RC and an EP on the two nodes of my network? Is imx-4.1.15-1.0.0 the right version? Could you also explain how to configure the recommended kernel to put each i.MX6 into PCIe RC or EP mode, and how to configure the kernel to reserve a block of memory as a buffer for the SDMA to copy the data into?
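From my reading of the EP/RC validation document, I imagine the kernel configuration would look roughly like the fragment below. The option names are my assumption based on the NXP i.MX kernel tree, so please correct me if they differ in imx-4.1.15-1.0.0:

```
# Kernel config sketch (option names assumed from the NXP i.MX tree):
# common to both boards
CONFIG_PCI=y
CONFIG_PCI_IMX6=y
# on the EP board only
CONFIG_EP_MODE_IN_EP_RC_SYS=y
# on the RC board only
CONFIG_RC_MODE_IN_EP_RC_SYS=y
```

For the reserved DMA buffer, I imagine a device-tree `reserved-memory` node along these lines; the address and size here are placeholders I made up, not values checked against the SABRE board's memory map:

```
/* DTS sketch: reserve 16 MiB for the SDMA buffer so the kernel
   never allocates from it (address/size are assumptions). */
reserved-memory {
        #address-cells = <1>;
        #size-cells = <1>;
        ranges;

        dma_buf: dma_buffer@8f000000 {
                reg = <0x8f000000 0x01000000>;
                no-map;
        };
};
```

Is this the intended approach, or does the validation setup reserve the buffer some other way (e.g. via boot arguments)?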