We are using an i.MX6 to process video from an HD camera: showing a live preview on the screen, recording short video clips, etc. We use GStreamer to achieve this, but the delay between camera and display is higher than we expected. What could we do to lower the video latency?
- Variscite VAR-SOM-MX6 Solo (i.MX6 Solo-based SoM) -- I know this is not the standard reference board, but there are no HW issues, so I hope it does not matter.
- i.MX6 Solo
- 1 GB RAM
- custom carrier board for the VAR-SOM-MX6 providing the following:
- ADV7181C HDTV video decoder -- used to convert the analog YPbPr signal to 16-bit digital YPbPr (connected to the CSI interface)
- data connections (HDMI, UARTs, Ethernet, ...)
- HD video camera (Sony FCB-EH6300) -- We are currently using the 1080p/25 (1920x1080, 25 FPS, progressive) mode.
- custom LCD monitor (FullHD IPS panel and chip providing the HDMI connection)
- our Linux image has been bitbaked using Variscite's Yocto Dora release, based on the fsl-L3.10.17_1.0.0GA release.
- kernel 3.0.35
- GStreamer 0.10.36
- gst-fsl-plugin 3.0.9
We use a minimal GStreamer pipeline (only mfw_v4lsrc and mfw_isink, nothing more) to capture the video signal from the camera and show it on the LCD.
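For reference, a sketch of such a pipeline in GStreamer 0.10 gst-launch syntax. The capture-mode and fps-n values here are illustrative assumptions, not necessarily our exact settings; the script only builds and prints the command, since it cannot run without the target hardware:

```shell
# Minimal capture-to-display pipeline sketch (GStreamer 0.10 element names).
# capture-mode=4 and fps-n=25 are assumed values for a 1080p/25 source.
PIPELINE="mfw_v4lsrc capture-mode=4 fps-n=25 ! mfw_isink"
echo "gst-launch $PIPELINE"
```

On the target this would be run directly as `gst-launch mfw_v4lsrc capture-mode=4 fps-n=25 ! mfw_isink`.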
We did some measurements on the whole system, with the following results:
- the video latency of the whole system (from camera to LCD display) is around 270 ms, which breaks down as follows:
- camera latency is 120 ms (3 frames; at 25 fps each frame takes 40 ms) -- Since we cannot change this behavior, we do not care about it for now. We may later switch to a 50/60 fps mode or use a different camera.
- the ADV7181C adds no latency (there are no frame buffers in the chip).
- i.MX6 processing takes around 135 ms -- This is where we would like to save some time. The pipeline is so simple that we see no reason for it to introduce such a delay.
- LCD panel processing takes a few milliseconds (we cannot measure it precisely, but we cannot reduce it anyway, so the exact value does not matter).
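To put the 135 ms in perspective: at 25 fps it amounts to more than three whole frame periods, which suggests several complete frames are being buffered somewhere between capture and display. A quick sanity check:

```shell
# Back-of-the-envelope: how many 25 fps frame periods fit into the
# 135 ms of i.MX6 processing latency we measured?
fps=25
frame_ms=$(( 1000 / fps ))          # 40 ms per frame at 25 fps
latency_ms=135
frames=$(( latency_ms / frame_ms )) # whole frame periods
echo "$frame_ms ms/frame -> at least $frames full frames of buffering"
```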
Below you can see the oscilloscope output from our measurement: we flashed a bright white LED directly into the camera, which causes a sudden white flash on the LCD display (before the camera's auto-gain even reacts), giving a sharp, measurable transition.
The channels show the following:
- Channel 1 (yellow) shows the voltage on the LED (i.e. when we flashed it).
- Channel 2 shows the Y component of the YPbPr signal from the camera. You can clearly see when the flash appears in the video signal, so we can measure the latency caused by the camera itself.
- Channel 3 shows one differential pair of the HDMI signal to the LCD display. We were quite surprised how easily the change can be observed in the HDMI signal (even with only a 70 MHz oscilloscope). We use this to calculate the latency caused by processing in the i.MX6; the cursors clearly show it is around 135 ms.
- Channel 4 shows the voltage on a photodiode sensing the light intensity of the LCD panel, i.e. when the user can actually observe the change on the LCD.
- We would like to minimize the delay caused by GStreamer processing, which is currently around 135 ms for a 1080p/25 video feed.
- Do you believe there is a way to eliminate part of this delay? We know there will always be some processing latency, but we would like to reduce it as much as possible.
- Is there an easy way to break down the latency caused by GStreamer, so we would know what causes the delay (format conversion, buffer passing, whatever else happens in there, ...)?
- The other thread dedicated to video latency (i.MX6 Video Latency) did not help us much: we are already using mfw_isink, yet our video delay is considerably longer than the one described there (135 ms vs. 59 ms). I am not sure what framerate was used in that experiment, though.
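Regarding the break-down question, one approach we could imagine (a sketch, not something we have tried yet) is to insert identity elements between the pipeline stages and run gst-launch with -v: identity's last-message output includes buffer timestamps, so the stage-to-stage delay can be read off the log. The element name after_src below is made up for illustration; the script only prints the command, since it needs the target hardware to run:

```shell
# Hypothetical latency break-down for GStreamer 0.10: an "identity"
# element with silent=false between stages, shown via gst-launch -v,
# logs each buffer (including its timestamp) as it passes through.
CMD='gst-launch -v mfw_v4lsrc ! identity name=after_src silent=false ! mfw_isink'
echo "$CMD"
```

More identity elements can be added at other points of a longer pipeline to see which link contributes the delay.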