I'm working on getting video in via MIPI-CSI on the i.MX 8M Plus...starting with the EVK (8MPLUS-BB) and later moving to a custom PCB.
My image sensor is a Sony IMX287, for which no public driver currently exists outside of camera modules that have integrated it with an ASIC and control it through that ASIC. Fortunately, I have the datasheet & the list of extra initialization registers from our parts vendor, so I have a good idea of how to control it over I2C.
In our case, we are putting down the raw sensor with custom optics (no Bayer filter--we essentially use the sensor as monochrome) and converting the Sony Sub-LVDS output to MIPI CSI-2 via a Lattice FPGA.
Our (initial) needs for control are pretty basic: initialize the camera to run all-pixel, at maximum frame rate, with 10-bit RAW output, and control start & stop of streaming. Later, we will probably add external trigger control for exposure, and maybe black-level & gain controls.
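Concretely, I expect the control path to amount to something like the sketch below, assuming the IMX287 follows the usual Sony pattern of 16-bit register addresses with 8-bit values. The register names and addresses here are placeholders for illustration, not values from the datasheet:

```c
#include <linux/delay.h>
#include <linux/i2c.h>
#include <linux/regmap.h>

#define IMX287_REG_STANDBY	0x3000	/* placeholder address, not from the datasheet */
#define IMX287_REG_XMSTA	0x3002	/* placeholder: master-mode start */

/* Passed to devm_regmap_init_i2c() in probe. */
static const struct regmap_config imx287_regmap_config = {
	.reg_bits = 16,
	.val_bits = 8,
};

/* Stream the vendor init table, then release standby and start streaming. */
static int imx287_start_streaming(struct regmap *map,
				  const struct reg_sequence *init_regs,
				  int num_regs)
{
	int ret;

	ret = regmap_multi_reg_write(map, init_regs, num_regs);
	if (ret)
		return ret;

	ret = regmap_write(map, IMX287_REG_STANDBY, 0x00);
	if (ret)
		return ret;

	/* Sony sensors typically want a short settle before master start. */
	usleep_range(1000, 2000);

	return regmap_write(map, IMX287_REG_XMSTA, 0x00);
}
```

Stop streaming would be the mirror image (write standby back to 1), so the whole initial feature set really is just an init table plus two register writes.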
The initial plan, before I was brought onto the project, was to control the sensor via I2C from the M7 core, with the video going through the Lattice bridge into the first CSI port on the i.MX8MP, and to use GStreamer on an A53 core to pick up the stream and either save off images or ship the stream over the network.
That is where I joined the project. Nobody had a plan for how to set up the MIPI input--I think it was just assumed that GStreamer would hook up to the MIPI port and data would flow once you told the sensor to start streaming. I realized there was a need for an actual Linux camera driver, since GStreamer needs a device to talk to.
So we are moving the sensor's I2C control to the A53 side to allow a proper Linux camera driver.
I started digging into V4L2, which I have never written for before. I had a look at the i.MX 8M Plus camera sensor porting guide and saw references to "vvcam", the ISI, and the ISP, plus some need for an XML file to describe the Bayer filter (we have none), lens parameters, etc.
This all seems much more complicated than just a V4L2 sub-device & a device-tree entry declaring that the sub-device feeds the MIPI-CSI1 port on the i.MX8MP.
So my first set of questions is:
1) Is it required to utilize the ISI and Vivante ISP? Or can it be as simple as a "standard" V4L2 sub-device (mostly an I2C driver with the hooks V4L2 needs, ioctl definitions, etc.) with a dts file to "wire it up"? (Roughly what I have in mind is sketched after these questions.)
2) If the Vivante ISP is required, what do you do if you are operating without a Bayer filter and don't want anything done about lens parameters (dewarp, etc.)? I just want the raw 10-bit images; we have a PC-based application that processes them later.
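For reference, what I mean by a "standard" sub-device in question 1 is roughly the following skeleton, modeled on existing upstream I2C sensor drivers. This is a sketch only: the compatible string is hypothetical, and error handling, format ops, and controls are omitted:

```c
#include <linux/i2c.h>
#include <linux/module.h>
#include <media/media-entity.h>
#include <media/v4l2-async.h>
#include <media/v4l2-device.h>
#include <media/v4l2-subdev.h>

struct imx287 {
	struct v4l2_subdev sd;
	struct media_pad pad;
};

static int imx287_s_stream(struct v4l2_subdev *sd, int enable)
{
	/* Start/stop register writes over I2C would go here. */
	return 0;
}

static const struct v4l2_subdev_video_ops imx287_video_ops = {
	.s_stream = imx287_s_stream,
};

static const struct v4l2_subdev_ops imx287_subdev_ops = {
	.video = &imx287_video_ops,
};

static int imx287_probe(struct i2c_client *client)
{
	struct imx287 *sensor;
	int ret;

	sensor = devm_kzalloc(&client->dev, sizeof(*sensor), GFP_KERNEL);
	if (!sensor)
		return -ENOMEM;

	v4l2_i2c_subdev_init(&sensor->sd, client, &imx287_subdev_ops);
	sensor->sd.flags |= V4L2_SUBDEV_FL_HAS_DEVNODE;
	sensor->sd.entity.function = MEDIA_ENT_F_CAM_SENSOR;

	sensor->pad.flags = MEDIA_PAD_FL_SOURCE;
	ret = media_entity_pads_init(&sensor->sd.entity, 1, &sensor->pad);
	if (ret)
		return ret;

	/*
	 * The dts "wiring" takes effect here: the async framework matches
	 * this sub-device against the endpoint under the CSI receiver node.
	 */
	return v4l2_async_register_subdev(&sensor->sd);
}

static const struct of_device_id imx287_of_match[] = {
	{ .compatible = "sony,imx287" },	/* hypothetical compatible string */
	{ }
};
MODULE_DEVICE_TABLE(of, imx287_of_match);

static struct i2c_driver imx287_driver = {
	.driver = {
		.name = "imx287",
		.of_match_table = imx287_of_match,
	},
	.probe_new = imx287_probe,
};
module_i2c_driver(imx287_driver);
MODULE_LICENSE("GPL");
```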
Many thanks for any guidance you can provide.
After a lot of work, I got this working with the ISI. I avoided the ISP because it is really involved to set up, and we didn't have time (or need, in our use case) to go through a lens calibration like the ISP requires. There is probably some identity matrix that could be used for that, but I found the ISP to be pretty difficult to develop for.

In the end, I added support for RAW10 to the ISI and, after a lot of trial and error, got things working. In our case, we are trying to get a high frame rate, and the existing code seems to cap out at about 90 fps (context-switching overhead going from kernel space to user space). I'm in the midst of finishing a modification to the ISI driver to capture 4 frames per V4L2 buffer, which looks like it's going to allow us to get all the frames to user space. The next challenge will be how to unpack them again without losing frame rate, but at least I should be able to take advantage of multiple cores there (a rough sketch of the user-space side is below).
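For anyone following along, the user-space side of that scheme looks roughly like this. FRAMES_PER_BUF and the back-to-back packing layout are specific to our modified ISI driver, so treat this as a sketch rather than something that works against the stock driver:

```c
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

#define FRAMES_PER_BUF 4	/* matches our modified ISI driver */

/*
 * One dequeue now yields four sensor frames packed back-to-back in the
 * same mmap'ed buffer, so the kernel/user round trip happens at a
 * quarter of the frame rate.
 */
static int drain_buffer(int fd, void *mmap_base[], size_t frame_size,
			void (*consume)(const void *frame, size_t len))
{
	struct v4l2_buffer buf = {
		.type = V4L2_BUF_TYPE_VIDEO_CAPTURE,
		.memory = V4L2_MEMORY_MMAP,
	};
	const char *base;
	int i;

	if (ioctl(fd, VIDIOC_DQBUF, &buf) < 0)
		return -1;

	base = mmap_base[buf.index];
	for (i = 0; i < FRAMES_PER_BUF; i++) {
		/*
		 * consume() can hand each slice off to a worker thread
		 * (one per core) to keep unpacking off the capture path.
		 */
		consume(base + (size_t)i * frame_size, frame_size);
	}

	return ioctl(fd, VIDIOC_QBUF, &buf);	/* requeue for the driver */
}
```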
If you use the ISI, just be aware that it is set up with the expectation that the camera has already gone through an ISP and is outputting a color format (either RGB or YUV). Getting your clocks set up right is critical. The ISI *is* capable of understanding incoming RAW formats and dealing with them properly, but it has a lot of quirks you have to learn by experimentation and by re-reading the manual over & over.
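For example, this is roughly how we ask the capture node for 10-bit RAW from user space. Whether the driver accepts Y10 or substitutes a Bayer fourcc depends on how your sub-device advertises its media-bus code, so check what VIDIOC_ENUM_FMT actually offers:

```c
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/*
 * Request 10-bit raw from the capture node. For a monochrome sensor the
 * natural fourcc is Y10, but depending on the media-bus code reported by
 * the sub-device you may get a Bayer variant like SBGGR10 instead.
 */
static int set_raw10(int fd, unsigned int width, unsigned int height)
{
	struct v4l2_format fmt;

	memset(&fmt, 0, sizeof(fmt));
	fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	fmt.fmt.pix.width = width;
	fmt.fmt.pix.height = height;
	fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_Y10;
	fmt.fmt.pix.field = V4L2_FIELD_NONE;

	if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0)
		return -1;

	/* The driver may adjust the format; verify rather than assume. */
	if (fmt.fmt.pix.pixelformat != V4L2_PIX_FMT_Y10)
		fprintf(stderr, "driver substituted fourcc %.4s\n",
			(char *)&fmt.fmt.pix.pixelformat);
	return 0;
}
```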
How are you interfacing? In our case, we use a Lattice FPGA (an older MachXO3) with their IP. Their IP had some issues, especially because our IMX287 has a lower resolution than the IP was designed for; we had to put a bit of a hack into the FPGA code to get data through properly. If you are on a newer CrossLink-series part, it may go better for you.
Hello @DWSmith
I have a very similar use case, though using the IMX250 sensor. I'm confident about the sensor itself (an implementation already exists on a different platform), but there is some confusion about the ISP. Did you make any progress on this?