We have a custom i.MX6QP board communicating with a custom CMOS sensor board (Sony IMX290 based) over a MIPI interface. We developed a camera driver and modified some of the NXP BSP files, and the solution mostly works, but we are seeing strange behavior apparently related to vertical sync that we have not been able to solve.
The camera streams data correctly (all MIPI_CSI registers report the expected values), but we are not receiving the start of frame at the correct position.
For debugging purposes, the camera sends a test pattern that fixes all pixels of one specific row to 0x3FF (all bits set in 10-bit mode). We never find this pattern at the expected row; instead it appears in other rows, changing from frame to frame, with some kind of "repetitive" behavior.
Camera works in 10-bit mode, i.e. each group of 4 pixels is encoded in 5 bytes, per the MIPI CSI-2 RAW10 format.
Camera resolution is 1920x1080.
Camera uses 4 MIPI lanes at 222.75 Mbps/lane.
Horizontal sync seems to work correctly.
Framerate is 1 fps (set in the sensor via two registers: FRSEL, which fixes the 1H time to 29.6 us, and VMAX, which fixes the number of lines). The same issue happens at higher framerates.
Patches applied against the latest NXP kernel:
Our capture test software is based on the standard V4L2 capture example (https://linuxtv.org/downloads/v4l-dvb-apis/uapi/v4l/capture.c.html), adapted to our image format.
There seems to be some relation between the vsync issue and the sensor register WINWV_OB (which sets the vertical OB, optical black, region of the image). Modifying this register has a direct influence on the row displacement.
Some frames are "lost": in the example sent we capture 7 frames but receive 11 end-of-frame interrupts, so some frames apparently never arrive at our dequeue (DQBUF).
Attached is a video showing the effect.
Any ideas about what could be happening?
Thanks for the feedback!