I have a Freescale LS2080 box for which I am developing a custom Linux 4.1.8 kernel using the Freescale Yocto project.
I have an NVMe hard disk attached to the LS2080 via a PCIe card, but the disk is not recognised when I boot the board with my custom Linux kernel. It is not recognised in u-boot either when I scan the PCI bus.
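(For reference, the scan at the u-boot prompt is along these lines; the exact invocation depends on the u-boot build, but the stock `pci` command lists the devices found on each bus:)

```
=> pci 0     # list devices enumerated on bus 0
=> pci 1     # the slot behind the root port may show up on a higher bus
```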
I plugged the same combination of NVMe disk and PCIe card into a desktop PC running Debian with Linux 3.16.7, and it was detected and mounted without problems. I also repeated the experiment with a different PCIe card and a different NVMe SSD and got the same result.
When building the LS2080 kernel using the Yocto project, I have enabled the NVMe block device driver and I have verified that this module is present in the kernel when booting on the board. The PCIe slot on the board is working fine because I have tried it with a PCIe Ethernet card and a PCIe SATA disk.
The only difference I can point to is lane width: the Ethernet card is a x1 PCIe device and the SATA disk is x2, while the NVMe disk is x4. Perhaps the x4 lanes aren't working on the board, or the DIP switches are set incorrectly?
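On the working Debian box, `lspci -vv -s <slot>` reports the link width the endpoint advertises (`LnkCap`) and the width the link actually trained at (`LnkSta`), which is one way to test the lane hypothesis. As a small sketch, here is how those two values could be pulled out of the `lspci -vv` text; the sample output and the function name are mine, not from the original post:

```python
import re

# Hypothetical excerpt of "lspci -vv" output for an NVMe device; substitute
# the real text captured on the machine where the disk works.
LSPCI_VV = """\
01:00.0 Non-Volatile memory controller: Intel Corporation Device 0953
        LnkCap: Port #0, Speed 8GT/s, Width x4, ASPM L1
        LnkSta: Speed 8GT/s, Width x4
"""

def link_widths(text):
    """Return (capable_width, negotiated_width) parsed from lspci -vv text."""
    cap = re.search(r"LnkCap:.*?Width x(\d+)", text)
    sta = re.search(r"LnkSta:.*?Width x(\d+)", text)
    return int(cap.group(1)), int(sta.group(1))

cap, sta = link_widths(LSPCI_VV)
print(cap, sta)
if sta < cap:
    # A link that trains narrower than the card supports would point at the
    # board's lane wiring or switch settings rather than the kernel driver.
    print("link trained below capability")
```

If the desktop negotiates x4 but the LS2080 slot only ever brings up a narrower link (or no link), that would support the lane-configuration theory.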
I suspect that I am missing something in the kernel configuration or the device tree, but I'm not sure what. When I enable the NVMe driver in menuconfig, its dependencies should be resolved automatically.
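One device-tree detail worth checking: on Layerscape parts the DesignWare-based PCIe controller node takes a `num-lanes` property, which has to match the lane width the SerDes is configured for. Roughly like this (node name, unit address, and property values here are illustrative, not copied from the actual ls2080a dtsi):

```
pcie@3400000 {
        compatible = "fsl,ls2080a-pcie";
        num-lanes = <4>;    /* must match the SerDes protocol for a x4 slot */
        /* other properties unchanged */
};
```

If `num-lanes` (or the SerDes protocol selected by the RCW/DIP switches) only provisions one or two lanes on that slot, a x4-only card might fail to train at all.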
Can anyone provide insight into what I am missing?