Capturing YUV420 from MIPI CSI camera for VPU compression


Contributor III

Hi guys,

I'm using the iMX6Q MIPI CSI and IPU to capture video from a camera.

My camera supports multiple data types, and I am able to capture in the RGB and YUV422 data types.

I am now experimenting with YUV420, and I hope to be able to send the video stream into the VPU for compression (which only supports YUV420 for most codecs).

Well, it is not clear from the i.MX6 and IPU specs whether I can use this YUV420 data type:

According to IMX6DQRM.pdf, the MIPI component supports all formats and the CSI2IPU component supports all formats, but it seems that the IPU does not support YUV420 (it is missing from the description of the IPUx_CSI0_SENS_CONF register).

If I understand correctly, I can only use YUV420 if I configure it as generic MIPI data, in which case I won't be able to have the IPU work on it (i.e. stream the frame through its conversion block or the format converter in the IDMAC).

However, since the VPU codecs can only work on YUV420, is it possible to create a capture loop which compresses the video without using the CPU for format conversion?

Please advise :smileyhappy:

Ofer

1 Solution
NXP Employee

1080p camera recording was validated on i.MX6 Android. A similar data flow can be adopted in any Linux-based OS: the data from the CSI is in YUYV/UYVY format -> when the data is written from the CSI to memory, the format becomes NV12 via the CPMEM settings -> the VPU uses NV12 for encoding.

Contributor III

Hi Xiaoli Zhang

Thanks! I was certain that this kind of pixel format conversion had to go through the IC, which is limited to 1024x1024 output.

Nevertheless, it works :smileyhappy:

I'd appreciate it if you could help me figure out the following:

* I'd still prefer streaming YUV420 into the i.MX6 instead of YUV422, as less bandwidth means less power consumption and less heat. If I understand correctly, this is still not doable. Am I correct?

* Reading the manual again, I still find it hard to understand how the IDMAC performs this YUV422->YUV420 conversion by itself. Is it the FCW block? I was quite certain it was only capable of reordering bytes... Does it subsample the chroma bytes, or is it something arithmetic?

* If the YUV422 to YUV420 conversion is done in the IDMAC, why does the IPU library limit the output to 1024x1024 even when all I am trying to do is format conversion (without resizing)? Is it a software limitation which I can override, or is there a hardware reason for it?

Thanks again!

Ofer

NXP Employee

Following are some comments for your questions:

1. According to the IPUv3 spec, the CSI cannot capture YUV420 pixels.

2. After the CSI captures YUV422 pixels, they are converted to YUV444 internally. When the pixels need to be written to memory, the IDMAC converts the YUV444 pixels to YUV420 (the CPMEM's PFS, UBO and VBO entries determine the pixel format; of course, UBO and VBO are only valid for non-interleaved and partially interleaved pixel formats).

3. The IPU IC module has an output limitation, which is clearly documented in the IPUv3 spec. Here is a quotation from it:

  39.3.1.5.3 Image Converter (IC)

  Output: to system memory or (for a single active flow) to a display device (through the DP).

  Frame size: up to 1024x1024 pixels

The IPU library has a split mode, which is a software workaround for this IC limitation. For instance, to produce a 1080p frame, the IC task is run 4 times.

Contributor IV

I would like CSI-->MEM to generate YUV422P data (ie non-interleaved YUV 4:2:2)

However, when I specify this as the format with VIDIOC_S_FMT on DMA channel 0, it seems that the data being generated is YUV420P. I say this because, when displayed, the chroma signals are only painted on the top half of the screen and are squashed vertically; the bottom half of the screen is saturated with pink and blue. The correct luma is apparent throughout the whole image. Therefore, a 4:2:0 planar format is being interpreted as a 4:2:2 planar format.

To prove this, I changed the offsets provided to the encoder:

   fb[src_fbid].bufY = v4l2_buf.m.offset;  // cap_buffers[v4l2_buf.index].offset; I am encoding directly from the capture buffer
   fb[src_fbid].bufCb = fb[src_fbid].bufY + img_size;
   // see if we can interpret what should be 422P as 420P by changing the Cr offset
   fb[src_fbid].bufCr = fb[src_fbid].bufCb + img_size / 2;  // position offset as if we are getting 422P
                                                            // (instead of img_size >> 2 for 420P)

The encoded image was perfect! So although the driver is asking for 422P, and setting up the Cr and Cb offsets correctly to allow for enough space for each chroma signal to have all (ie not half) the scan lines, the IDMAC is generating 420P!! Is this a firmware bug??

I am trying to determine why the IDMAC would be wrongly decimating the data down to 4:2:0

NXP Employee

Hi,

I think we've already verified generating YUV422P data (ie non-interleaved YUV 4:2:2) via the CSI-MEM path, because we have a v4l2 capture unit test covering this. If you use the Linux BSP instead of Android, you probably already have the unit test code in linux-test.git. IIRC, the command to run the unit test and capture several VGA YUV422P frames into a test.yuv file could be:

/unit_tests/mxc_v4l2_capture.out -iw 640 -ih 480 -ow 640 -oh 480 -f 422P -i 1 ./test.yuv


To show the captured frames on a XGA display by using v4l2 output, IIRC, the unit test command could be:

/unit_tests/mxc_v4l2_output.out -iw 640 -ih 480 -ow 1024 -oh 768 -d 3 -f 422P ./test.yuv


Regards,

Liu Ying


Contributor IV

Thanks, I think I have worked out what is happening.

I am deinterlacing using the CSI->MEM transfer by setting ILO (interlace offset) to the Y pixel stride and doubling SLY (Y Stride Length) so as to deinterlace into a single capture buffer.

However I have changed from non-planar (interleaved) YUV422 to planar (non-interleaved) YUV422P so as to facilitate conversion to YUV420P which is all the VPU encoder can accept.

What I believe is happening is that the chroma for the first field is being overwritten by the chroma for the second field.

The solution should be to double SLUV (U and V Stride Length) to deinterlace the chroma planes, just as for the Y plane.

However, having tried this, it does not work correctly, and I wonder if this is because there is no ILO for the chroma planes; that is, both fields are still being written over each other, rather than offset from each other.

Should the ILO setting be automatically offsetting the chroma planes as well as the Y plane for YUV422? This seems to be implied by the Reference Manual table 45-11:

For YUV420 formats, the ILO is relevant only to the Y component as the U and V components do not exist for the even lines.

Do I need to adjust the UBO, VBO or IOX parameters at the start of each field to offset the writing of the U and V planes?

Contributor IV

OK, it seems that ILO does adjust the chroma offset for each field, provided I set PFS to 3 (YUV 4:2:2 partial-interleaved) with uv_stride = 2 * the normal Y stride.

Contributor IV

Basically I have a proof of concept: I can display live deinterlaced PAL at low latency (without the VDIC) and simultaneously encode. The encoded video is partial-interleaved 420P; its chroma is resized from the CSI's partial-interleaved 422P. The chroma is resized as if it were IPU_PIX_FMT_UYVY; treating it as IPU_PIX_FMT_GENERIC didn't work, even though it should be treated the same?

The i.MX53 hardware is very good but the documentation is a bit lacking and I think must scare potential customers away!

Contributor IV

The last obstacle is that the fields are flipped when I capture NTSC. I am not sure whether this is due to a faulty capture device.

I think it is related to analog PAL being "upper field first" and NTSC being "lower field first".

Regardless, what should I do to fix this in software? i.e. be able to deinterlace both V4L2_FIELD_INTERLACED_TB and V4L2_FIELD_INTERLACED_BT.
I have tried setting the channel 0 interlace offset (ILO) to a negative value (the Reference Manual says ILO is signed); however, this prevented any video from being captured at all.

Contributor IV

The solution to flipped NTSC field order is here

https://community.freescale.com/thread/306486

Contributor III

Thank you xiaoli.zhang for the detailed explanation.

I will try and use the split mode for that matter.

As it seems that you have valuable knowledge of the IPU, may I ask you another question?

I am capturing 5MP images with my camera, and I would like to switch the data path in the IPU on a frame to frame basis.

i.e. use the CSI-PRP-MEM path to reduce the frame size to 1MP and, once every 10 frames, get the full 5MP image via the CSI-MEM path.

Is that possible?

And more generally: are you familiar with a working example of how to use the SRM mechanism in the IPU?

Thanks a lot

Ofer

Contributor III

Hi Ofer,

I know it's an old thread, but I was wondering if you've ever succeeded in using this IPU split-mode?

Contributor III

Hi Ivan

I'm afraid I didn't succeed in doing so, but I didn't give it much attention in the end.

I managed to continue working within the IPU limitations.

Regards,

Ofer

Contributor III

Then I'll only ask this if I may:

I currently have a 1080p60 stream into the CSI in YUV422. There's no way to bypass the IC, since I have to do color conversion to RGB, but at the output (HDMI) I am maxing out at a bit more than XGA (1024x835) @ 60fps; I cannot even reach 1024x1024. I am using the PRP_VF case, and I am only a bit over 50 Mpix/sec (the IC output limit is supposed to be 100 Mpix/sec). Did you have any similar issues?

Also, by now you probably know the IPU like the back of your hand: is it even possible to somehow get 1080p60 through the IC, or am I dreaming?

NXP Employee

Hi, Ofer,

I think it is feasible to switch the CSI data path. You just need to set up the CSI-PRP-MEM path to capture ten 1MP frames, stop this path and then set up the CSI-MEM path to capture the full 5MP image. The v4l2 capture driver in the mxc Linux BSP can cover this use case (simply streamon, streamoff and then streamon again).

Regarding SRM, the IPUv3 driver in mxc linux BSP touches that in drivers/mxc/ipu3/ipu_disp.c. This is what we've done for SRM currently, as far as I know.

Regards,

Liu Ying

Contributor III

Hi Liu Ying

Will try that, thanks!

Do I need to stop the sensor itself from streaming when I de-activate and re-activate the IPU CSI receiving path?

I'm looking for a solution which does not require such a restart of the stream, since it will consume time on the sensor itself as well...

NXP Employee

Hi, Ofer,

I think that if you don't need to change the sensor output format (resolution or pixel format), you don't have to stop the sensor from streaming when you de-activate and re-activate the IPU CSI receiving path.

A possible data path for taking a picture while previewing is:

sensor(5MP) -> csi -> mem(5MP) -> prp_vf -> mem(WVGA) -> WVGA display
                         |
                         |-> encoding (5MP)

Contributor III

Hi Ying Liu

I have the data path you suggested working - thanks. However, I am not satisfied with it, since most of the time I only need the low-res image, which means the 5MP image is consuming memory bandwidth (and power) that could potentially be avoided.

I am now trying to change the image data path on-the-fly without losing any frames: I modify the CSI0_DATA_DEST register to switch between the CSI-IC-MEM and CSI-MEM paths, without stopping the data stream.

It appears that it is not as trivial as I thought, but I'm still working on it.

I'll also try to utilize both data paths simultaneously, and use frame skipping on the CSI-MEM path to reduce the amount of unnecessary memory transactions.

I'd appreciate your thoughts or tips on the matter...

NXP Employee

Hi, Ofer,

You may try to change the image data path on-the-fly. Perhaps the right moment to switch would be at the EOF (end of frame) interrupt.

Regards,

Liu Ying

Contributor IV

I think this is exactly the problem I have.

I am considering using imgconvert.c from ffmpegcolorspace

http://code.google.com/p/ossbuild/source/browse/trunk/Main/GStreamer/Source/gst-plugins-base/gst/ffm...

but I fear it will be very CPU intensive. Currently, without colorspace conversion, encoding uses 7% of the CPU.

Can the IC be used to do this YUV422 -> YUV420 conversion? I am already using the IC for resizing, so it could be difficult to reconfigure it to alternately perform each task...
