Trying to encode H.264 files.

ricardo_ioct
Contributor III

Greetings,

My name is Ricardo, and I've recently been working on a board based on an i.MX53 module. I'm using the 11.09 BSP and the IMX_MMCODECS releases. As the title says, I'm trying to encode H.264 files from a USB camera using the VPU-based GStreamer plugins. After a lot of research and many test pipelines, we found the following:

1. The USB camera provides JPEG data (image/jpeg), so we can't use mfw_v4lsrc, which only handles raw video (video/x-raw-yuv).

2. To encode, we first need to decode to raw video and convert the pixel format (Y42B to NV12). This step is important because, despite what gst-inspect reports, mfw_vpuencoder does not work with the Y42B pixel format.

3. After that, we encode to H.264 using mfw_vpuencoder.

This is the working pipeline:

gst-launch v4l2src device=/dev/video0  ! image/jpeg,width=1280,height=720,framerate=15/1 ! mfw_vpudecoder codec_type=std_mjpg loopback=true ! video/x-raw-yuv,format=\(fourcc\)Y42B ! ffmpegcolorspace ! video/x-raw-yuv,format=\(fourcc\)NV12  ! mfw_vpuencoder loopback=true codec-type=std_avc ! avimux ! filesink location=teste.avi

Although this works, it is not ideal: because of the ffmpegcolorspace plugin, CPU usage reaches 80%. We tried the IPU colorspace plugin (mfw_ipucsc), but it does not accept Y42B frames as input. We also tried several ways to force the VPU decoder to output frames in I420 or NV12 format, but failed every time. Our approaches:

1. Setting the fmt parameter of mfw_vpudecoder to 1, which is supposed to produce I420 output;

2. Putting a capsfilter after the decoder (see the sketch after this list);

3. Using mfw_deinterlacer instead of mfw_ipucsc. That did not work either.
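For reference, approach 2 was roughly the following (a sketch of what we tried; the exact caps we requested varied between I420 and NV12, and in every case we failed to get anything other than Y42B out of the decoder):

gst-launch -vvv v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=15/1 ! mfw_vpudecoder codec_type=std_mjpg ! video/x-raw-yuv,format=\(fourcc\)I420 ! fakesink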

Some of the pipelines and their debug output:

gst-launch -vvv v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=15/1 ! mfw_vpudecoder codec_type=std_mjpg ! mfw_deinterlacer chrom-fmt=0 ! fakesink

Setting pipeline to PAUSED ...

[INFO]  Product Info: i.MX53

VPU Version: firmware 13.4.32; libvpu: 5.1.4

MFW_GST_VPU_DECODER_PLUGIN 2.0.3 build on Jul 13 2012 06:30:57.

/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = image/jpeg, width=(int)1280, height=(int)720, framerate=(fraction)15/1

Pipeline is live and does not need PREROLL ...

Setting pipeline to PLAYING ...

New clock: GstSystemClock

/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = image/jpeg, width=(int)1280, height=(int)720, framerate=(fraction)15/1

/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = image/jpeg, width=(int)1280, height=(int)720, framerate=(fraction)15/1

/GstPipeline:pipeline0/MfwGstVPU_Dec:mfwgstvpu_dec0.GstPad:sink: caps = image/jpeg, width=(int)1280, height=(int)720, framerate=(fraction)15/1

/GstPipeline:pipeline0/MfwGstVPU_Dec:mfwgstvpu_dec0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)Y42B, width=(int)1280, height=(int)720, width_align=(int)16, height_align=(int)16,0

/GstPipeline:pipeline0/MfwGstVPU_Dec:mfwgstvpu_dec0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)Y42B, width=(int)1280, height=(int)720, width_align=(int)16, height_align=(int)16,0

Deinterlacer default input format is I420

Deinterlacer default input format is I420

/GstPipeline:pipeline0/MfwGstDeinterlace:mfwgstdeinterlace0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)Y42B, width=(int)1280, height=(int)720, width_align=(int)16, height_align=0

/GstPipeline:pipeline0/MfwGstDeinterlace:mfwgstdeinterlace0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)Y42B, width=(int)1280, height=(int)720, width_align=(int)16, height_align=0

/GstPipeline:pipeline0/MfwGstDeinterlace:mfwgstdeinterlace0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)Y42B, width=(int)1280, height=(int)720, width_align=(int)16, height_align0

gst-launch -vvv v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=15/1 ! mfw_vpudecoder codec_type=std_mjpg fmt=1 ! mfw_ipucsc ! fakesink

IPU_CSC_CORE_LIBRARY_VERSION_INFOR_01.00.

MFW_GST_IPU_CSC_PLUGIN 2.0.3 build on Jul 13 2012 06:30:48.

Setting pipeline to PAUSED ...

[INFO]  Product Info: i.MX53

VPU Version: firmware 13.4.32; libvpu: 5.1.4

MFW_GST_VPU_DECODER_PLUGIN 2.0.3 build on Jul 13 2012 06:30:57.

/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = image/jpeg, width=(int)1280, height=(int)720, framerate=(fraction)15/1

Pipeline is live and does not need PREROLL ...

Setting pipeline to PLAYING ...

New clock: GstSystemClock

/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = image/jpeg, width=(int)1280, height=(int)720, framerate=(fraction)15/1

/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = image/jpeg, width=(int)1280, height=(int)720, framerate=(fraction)15/1

/GstPipeline:pipeline0/MfwGstVPU_Dec:mfwgstvpu_dec0.GstPad:sink: caps = image/jpeg, width=(int)1280, height=(int)720, framerate=(fraction)15/1

/GstPipeline:pipeline0/MfwGstVPU_Dec:mfwgstvpu_dec0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)Y42B, width=(int)1280, height=(int)720, width_align=(int)16, height_align=(int)16,0

ERROR: from element /GstPipeline:pipeline0/MfwGstVPU_Dec:mfwgstvpu_dec0: fatal error

Additional debug info:

Allocation of the Frame Buffers Failed

Execution ended after 11049559049 ns.

Setting pipeline to PAUSED ...

Setting pipeline to READY ...

/GstPipeline:pipeline0/MfwGstVPU_Dec:mfwgstvpu_dec0.GstPad:src: caps = NULL

/GstPipeline:pipeline0/MfwGstVPU_Dec:mfwgstvpu_dec0.GstPad:sink: caps = NULL

/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = NULL

/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = NULL

/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = NULL

Setting pipeline to NULL ...

>>VPU_DEC: State: Ready to Null

Freeing pipeline ...

[--->FINALIZE vpu_dec

As you can see, I can't get the decoder to output any format other than Y42B.

Does anybody know a better way to link these plugins?

Thanks in advance,

Ricardo Gurgel.

1 Solution
FranciscoCarril
Contributor V

gst-launch -vvv v4l2src ! fakesink

Setting pipeline to PAUSED ...

/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw-rgb, bpp=(int)24, depth=(int)24, red_mask=(int)255, green_mask=(int)65280, blue_mask=(int)16711680, endianness=(int)4321, width=(int)1280, height=(int)800, framerate=(fraction)10/1

Pipeline is live and does not need PREROLL ...

Setting pipeline to PLAYING ...

New clock: GstSystemClock

9 Replies
ricardo_ioct
Contributor III

Great, it's very good to know about these cameras. Just to confirm, did you run these pipelines on the i.MX53 QSB or on a PC? Francisco, I'd like to thank you again for your tests; they will help me a lot.
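If we do switch to a camera that delivers raw YUV, my plan is to try something along these lines (an untested sketch; I still have to check which YUV format the camera offers and whether mfw_ipucsc and mfw_vpuencoder accept it):

gst-launch v4l2src ! video/x-raw-yuv,width=1280,height=720,framerate=15/1 ! mfw_ipucsc ! video/x-raw-yuv,format=\(fourcc\)NV12 ! mfw_vpuencoder codec-type=std_avc ! avimux ! filesink location=teste.avi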

FranciscoCarril
Contributor V

Hello! You are welcome.

Every pipeline was executed on an i.MX53 QSB running the Ubuntu demo image, so it can be replicated on your board.

Francisco

ricardo_ioct
Contributor III

Interesting. By default the camera provides a 'video/x-raw-rgb' stream, but, if necessary, can it also generate a 'video/x-raw-yuv' stream? It could be a pipeline like this:

gst-launch v4l2src ! 'video/x-raw-yuv,width=1280,height=720,framerate=15/1' ! fakesink ...

Thank you.

FranciscoCarril
Contributor V

gst-launch -vvv v4l2src ! 'video/x-raw-yuv,width=1280,height=720,framerate=15/1' ! fakesink

Setting pipeline to PAUSED ...

/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)YV12, width=(int)1280, height=(int)720, framerate=(fraction)15/1

Pipeline is live and does not need PREROLL ...

Setting pipeline to PLAYING ...


FranciscoCarril
Contributor V

What model is your camera?

We have an HD Microsoft camera, and it gives us YUV format with very good results.

Have you considered using another camera?

ricardo_ioct
Contributor III

Hi,

Yes, we might change our camera if that works. We're using a USB camera based on an OV519 chip, which generates an MJPEG stream. What camera are you using? Is it USB?

Regards,

FranciscoCarril
Contributor V

We have a Microsoft LifeCam USB camera; we have tested it with the i.MX53 and the i.MX6.

ricardo_ioct
Contributor III

Thank you! It would be very helpful if you could run this pipeline for me and post the output:

gst-launch -vvv v4l2src ! fakesink

The messages after "New clock: GstSystemClock" don't matter; after that, you can kill the pipeline.

Again, thank you for the answers.

FranciscoCarril
Contributor V

gst-launch -vvv v4l2src ! fakesink

Setting pipeline to PAUSED ...

/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw-rgb, bpp=(int)24, depth=(int)24, red_mask=(int)255, green_mask=(int)65280, blue_mask=(int)16711680, endianness=(int)4321, width=(int)1280, height=(int)800, framerate=(fraction)10/1

Pipeline is live and does not need PREROLL ...

Setting pipeline to PLAYING ...

New clock: GstSystemClock
