Hi all,
Does anyone have experience changing register settings on the OmniVision OV5642 sensor? I am using the Boundary Devices SabreLite i.MX6 board as a host. I am trying to send raw sensor data, at full resolution, as fast as possible, across the parallel camera interface. I do not care about bit depth or color.
The OV5642 datasheet claims that it is rated for 15fps output when capturing a full-resolution image (2592x1944). However, when testing the interface with GStreamer, I top out at exactly 7.5fps. Scoping the pixel clock on my sensor, I measure the output clock running at 96 MHz. Additionally, one frame of data captured from the sensor is a ~7.5 megabyte file, corresponding to 12 bits per pixel. Looking at the ov5642 driver source, I see that camera data is being packed as YUV422, confirming the 12-bit depth. Since I'm using an 8-lane parallel interface, that means it takes two clock cycles to send one pixel's worth of data.
Using the equations on page 7 of this document, I can calculate that such an interface running at 96MHz tops out at 9.52fps, meaning that 12-bit data at 15fps won't work. This suspicion is confirmed by the following line from the ov5642 datasheet: "96Mhz is for sensor RAW data output at 15fps or YUV output at 7.5fps. For higher speeds such as 5 Megapixel YUV @ 15fps, OmniVision recommends using the MIPI two-lane interface."
So, I figure that if I can convince the ov5642 to support an 8-bit image format, I should be able to double the throughput of the parallel interface, and finally get to that 15fps. To this end, I have two questions on implementation.
(1) I have begun implementing the patch instructions described here and here to enable 8-bit raw/grayscale support in the Freescale v4l2 driver stack. However, applying the patches described in the second forum thread produces a strange result: although the build completes without errors, and I'm modifying neither the device driver itself nor any makefiles, the target driver (ov5642_capture.ko) is not included in the final build directory, even though the corresponding .o file is created. I am not sure what could be causing this. Even more frustratingly, if I run a build that produces this result and then roll back all of the patches, which should return me to a stock image, the kernel module is still neither compiled nor included. In this state I can confirm that the stock source tree successfully compiles the kernel modules by running "bitbake virtual/kernel -c compile_kernelmodules" - however, the drivers still do not appear in the final image. I have no idea why this happens or how to fix it.
(2) As far as I know, I need to patch two things in the device driver itself (ov5642.c): the registers controlling the data format for the full-resolution capture mode, and the advertised output pixel format, so that the v4l2 platform drivers know what to do with the data. To this end I have changed the output format in ov5642.c to V4L2_PIX_FMT_GREY.
I am also modifying the camera registers that are set in the driver to support raw data output rather than YUV422. Based on this application note, which has a general set of registers to change for raw mode, and the datasheet, it seems that I need to modify register 0x4300 and/or 0x501f. I am currently setting register 0x4300 to 0xf8 (raw data, bypassing the formatter module) and register 0x501f to 0x03 (setting the format mux to ISP RAW). Is this a reasonable approach? If anyone has other advice on the most straightforward way to receive and process 8-bit data from this sensor, I would love to hear it - I'm not wedded to raw or grayscale by any means, but it seems to be the easiest path to bring-up.
Any and all suggestions or advice would be greatly appreciated; thanks very much.