i.MX6 GStreamer-imx Plugins - Tutorial & Example Pipelines

85,678 Views
ryanerb
Contributor III

The Freescale i.MX6 has many video capabilities that are best accessed through GStreamer.

Gateworks, the leading supplier of powerful ARM-based single board computer solutions using the Freescale i.MX6, has invested countless engineering hours researching and mastering GStreamer for the i.MX series of processors. Gateworks would like to share this GStreamer research with the rest of the i.MX developer community! Visit the Gateworks Software Wiki GStreamer Page.

There are two main versions of GStreamer: 0.10 and 1.0. Version 1.0 is now the standard, and the examples below use the gstreamer-1.0 tools.
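
To check which version is installed on a target, the standard GStreamer tools report their version directly (assuming the 1.x tools are installed):

gst-launch-1.0 --version
gst-inspect-1.0 --version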

The i.MX6 processor has hardware blocks such as the IPU (image processing unit), the VPU (video processing unit), and the GPU (graphics processing unit). The main advantage of using these hardware blocks is that decoding or encoding a stream costs essentially no CPU time, because a dedicated hardware block in the i.MX6 takes care of it. This leaves the CPU free to deal with other programs.

GStreamer works with 'plugins'. A plugin comprises one or more elements that can do work on a media stream. For example, imxvpudec is a VPU-based decoder element.
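
For instance, gst-inspect-1.0 can list the elements a plugin provides, or dump the pads, caps, and properties of a single element (assuming the gstreamer-imx plugins are installed on the target):

# List the elements provided by the imxvpu plugin
gst-inspect-1.0 imxvpu

# Show the pads, supported caps, and properties of the VPU decoder element
gst-inspect-1.0 imxvpudec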

This post is specifically about the plugins. There are different versions and sets of plugins available.

Gateworks has chosen to use the GStreamer-imx plugins for the following reasons:

  • Open-source development model: the project is on GitHub and is very active
  • The main developer has been a GStreamer contributor for some time now and is very active in the GStreamer community
  • The source is very well documented and easy to follow
  • Things are done in a very standard GStreamer way

Plugin List

For a thorough description of each plugin, and why to use it, please visit the Gateworks Software Wiki GStreamer Page.

Type | Plugin(s) | Element(s) | Comments
Audio Decoder | imxaudio | imxuniaudiodec | Uses i.MX uniaudio codecs for decoding
Audio Encoder | imxaudio | imxmp3audioenc | Uses i.MX for MP3 audio encoding
Device Sources | imxv4l2video | imxv4l2videosrc | Get camera source via v4l2
Video Decoder | imxvpu | imxvpudec | VPU-based decoder
Video Encoder | imxvpu | imxvpuenc_mjpeg; imxvpuenc_mpeg4; imxvpuenc_h264; imxvpuenc_h263 | VPU-based encoders
Video Render (sink) | imxg2d; imxpxp; imxeglvivsink; imxipu | imxg2dvideosink; imxpxpvideosink; imxeglvivsink; imxipuvideosink | g2d (1), ipu (1), pxp (2), and egl (overlay) video sinks
Video Converter | imxg2d; imxpxp; imxipu | imxg2dvideotransform; imxpxpvideotransform; imxipuvideotransform | g2d, pxp, and ipu video filters/converters/scalers (3)
Video Compositing | imxg2d; imxipu | imxg2dcompositor; imxipucompositor | GPU/IPU accelerated compositing

1. The g2d sink is very flexible in the types of input video it can take, but it cannot convert to as many formats as the IPU can. On the other hand, the IPU is very picky about its input (e.g. requiring a 1px offset) and its kernel driver is largely undocumented, but, as stated before, it can convert between many colorspace formats.

2. Note that the PXP sinks are only applicable to the i.MX6Solo and i.MX6DL processors.

3. Please see note 1 above.
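
As a quick sanity check of the converter and sink elements listed above, a test pattern can be scaled by the IPU and rendered to a video sink (a minimal sketch; which sink actually works depends on whether the target is running on the framebuffer, X11, or Wayland):

# Scale a test pattern to 1280x720 with the IPU and display it
gst-launch-1.0 videotestsrc ! imxipuvideotransform ! 'video/x-raw, width=1280, height=720' ! imxipuvideosink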

Plugin Example Pipeline

For example, to capture video from a camera on /dev/video2, encode it to H.264, and save it to a file:

#Take camera input /dev/video2, encode it to h264 at a bitrate of 10mbit/s (CBR) and save to a file.
gst-launch-1.0 imxv4l2videosrc device=/dev/video2 ! imxvpuenc_h264 bitrate=10000 ! filesink location=/tmp/file.mp4
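
Note that filesink here writes the raw H.264 elementary stream straight from the encoder; if a real MP4 container is wanted, a parser and muxer can be added (a sketch, not from the original post; the -e flag makes gst-launch-1.0 send EOS on Ctrl-C so the muxer can finalize the file):

#Same capture, but muxed into an MP4 container
gst-launch-1.0 -e imxv4l2videosrc device=/dev/video2 ! imxvpuenc_h264 bitrate=10000 ! h264parse ! mp4mux ! filesink location=/tmp/file.mp4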

Many more pipeline examples are described and listed on the Gateworks Software Wiki GStreamer Pipelines page.

Summary

Using GStreamer 1.0 with the GStreamer-imx plugins is a powerful way to access and apply the multimedia capabilities of the Freescale i.MX6 processors!

If there are other examples you would like to see, please add to the discussion!


40 Replies

3,203 Views
farooq
Contributor II

I ran into an issue while building a custom Yocto image for our development:

"recipe gstreamer1.0-1.14.imx-r0: task do_fetch: Failed"

Any help will be appreciated. Thank you.

14,739 Views
wangtsungli
Contributor IV

Hi,

I'm trying to build a Debian rootfs with working GStreamer.

I'm using the i.MX6 Quad SABRE-SD core and designed the platform myself.

I'm also designing a Qt application to play video, so I need GStreamer to work with QMediaPlayer.

Does anyone have a website or document I can use to build Debian with GStreamer for the i.MX6 Quad SABRE-SD?

Thanks in advance!


14,741 Views
ryanerb
Contributor III

We do have an example of Qt and GStreamer written in C++ on our wiki here: Yocto/qt – Gateworks 


14,741 Views
brainmeltdown
Contributor I

I’ve run into another issue I’m hoping someone can shed light on. It might be more of a question for the GStreamer dev mailing list, but perhaps someone here can help.

I’m using two pipelines – one for decoding H.264 video with imxvpudec, and the other for rendering video with imxeglvivsink. The decoder pipeline terminates in an appsink and the rendering pipeline starts with an appsrc. I’m using gst_app_sink_pull_sample () to pull samples from the appsink and gst_app_src_push_sample () to push them to the appsrc.

All works fine if the video being decoded is 30 fps. But if the video is 15 fps then it is rendered in slow motion.

If I use a single GStreamer pipeline with the same stream of H.264 video then all is well at 30 and 15 fps.

I have debugged extensively (and learned a lot more about GStreamer doing it) and have confirmed that when the video is 15 fps the GST_BUFFER_DURATION() and GST_BUFFER_PTS() show the correct values.

I have also gotten the caps from the sample and from the imxeglvivsink sink pad and both show the correct frame rate.

I have also verified that the decoded frames are delivered to the render pipeline at 15 fps (via time stamped log messages).

So clearly there is something happening under the hood with a single GStreamer pipeline that is still missing with this split pipeline scenario.

Help would be greatly appreciated. I may have to use the single pipeline if I can’t solve it but it just doesn’t fit as nicely into our existing architecture.


14,740 Views
brainmeltdown
Contributor I

What should a proper pipeline look like for sending H.264 from imxvpuenc_h264 through h264parse, rtph264pay, and udpsink so that packets stay within the MTU and are not fragmented? I have coded a pipeline in C and set mtu=1200 on rtph264pay (even though the default is 1400, so it shouldn't be necessary), but I still get very large buffers out of rtph264pay, e.g. > 100 KBytes.

/* Link into pipeline */
gst_element_link_many (lEncoderSegP->videosrc, encoder, filtercaps, h264parser, rtp, udpsink, NULL);

Here I have set filtercaps caps property to

filt_caps264 = gst_caps_new_simple ("video/x-h264",
                                    "stream-format", G_TYPE_STRING, "byte-stream",
                                    "alignment", G_TYPE_STRING, "nal", NULL);

The rtp element is of type rtph264pay and I did this:

   g_object_set(rtp, "pt", 125, NULL);
   g_object_set(rtp, "mtu", 1200, NULL);  /* Shouldn't really be needed default is 1400 */

I also have encoder slice-size set

     g_object_set(encoder, "slice-size", 256*8, NULL);

Note that originally I was using a separate H.264 parser and RTP stack (not GStreamer); for small video resolutions like 176x144 it worked, but at larger sizes the MTU started being violated. There is no way to set the MTU of imxvpuenc_h264, only the slice-size, which is why I thought I would switch to using GStreamer for RTP instead.
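
For reference, a command-line equivalent of this kind of pipeline looks roughly as follows (a sketch only; the device, bitrate, payload type, host, and port are placeholders, and config-interval periodically re-sends SPS/PPS in-band):

gst-launch-1.0 imxv4l2videosrc device=/dev/video0 ! imxvpuenc_h264 bitrate=2000 ! h264parse ! rtph264pay mtu=1200 config-interval=1 pt=125 ! udpsink host=192.168.1.100 port=5000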


14,740 Views
rahulshah
Contributor I

We interfaced an OV5640 camera with an i.MX6S processor.

We need to capture images from the camera and display the live stream rotated by 90 degrees.

We do that with the following GStreamer pipeline:

"gst-launch-1.0 imxv4l2videosrc device=/dev/video1 input=0 imx-capture-mode=4 fps-n=15 ! imxipuvideotransform  ! imxeglvivsink "

But the video does not rotate.

There is no rotate parameter on the "imxipuvideotransform" plugin.

So which argument do we have to pass to this plugin to rotate the video?


14,740 Views
ssurowinski
Contributor I

It does rotate, use

videoflip method=counterclockwise

for example.
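
Dropped into the pipeline from the question above, that would look roughly like this (a sketch; note that videoflip is a software element, so the rotation itself does consume some CPU):

gst-launch-1.0 imxv4l2videosrc device=/dev/video1 input=0 imx-capture-mode=4 fps-n=15 ! imxipuvideotransform ! videoflip method=counterclockwise ! imxeglvivsink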


14,740 Views
PaulDeMetrotion
Senior Contributor I

I have a custom kernel and device tree based on the i.MX6DL. I am currently trying to play video, but I have a couple of issues:

  • None of the imx plugins are available. The gstreamer1.0-plugins-imx recipe appears to be included in my build. This is what I get when I run gst-inspect-1.0:

root@imx6dlc420:~# gst-inspect-1.0 imxvpudec

No such element or plugin 'imxvpudec'

root@imx6dlc420:~# gst-inspect-1.0 imxg2dcompositor

No such element or plugin 'imxg2dcompositor'

root@imx6dlc420:~# gst-launch-1.0  imxv4l2videosrc ! imxg2dvideosink

ERROR: pipeline could not be constructed: no element "imxv4l2videosrc".

  • I don't know if this is a problem, but I have no /dev/videoX devices available. Am I missing an element in my device tree that creates these?

Thanks in advance for any assistance.


14,740 Views
timharvey
Contributor IV

On Wed, Mar 30, 2016 at 7:14 AM, PaulDeMetrotion


14,740 Views
PaulDeMetrotion
Senior Contributor I

Tim,

Did you provide a response that didn't stick?


14,740 Views
timharvey
Contributor IV

Paul,

Yes, I tried and I guess it didn't go through - serves me right for trying to respond directly from e-mail.

You haven't provided any information on what sort of BSP/rootfs you have. The root filesystem is likely missing the gstreamer-imx package entirely, which would explain the missing 'imx*' gstreamer elements.

As far as the missing /dev/video* goes, that would be a different issue and could be any one of the following:

- missing device-tree nodes

- mxc_capture* drivers disabled in the kernel

- mxc_capture* drivers built as modules but missing from the filesystem

Again - I don't know anything about the board you are using, the kernel, or the BSP - you say 'custom kernel' but not why it's custom.

Tim


14,740 Views
PaulDeMetrotion
Senior Contributor I

Sorry for the lack of details. We have developed our own board that uses the i.MX6Q/DL/Solo processor. I am using Yocto to fetch and build the 3.14.28 Linux kernel provided by Freescale (NXP). I started with the Sabre-Auto board files and customized them to support our feature set. I am currently using the 'bitbake fsl-image-gui' command to build our BSP/rootfs. Does this recipe include the gstreamer-imx package?

When I tried to add the gstreamer1.0-plugins-imx package in the conf/local.conf file, I got a 'Multiple .bb files' error. What is the specific package I need to add?

Do you need more info? Thanks for your help.


14,740 Views
timharvey
Contributor IV

Paul,

As far as gstreamer-imx goes, the bitbake recipe is 'gstreamer1.0-plugins-imx' and I don't think it is included in fsl-image-gui (but I could be wrong, as we use our own images). You could always 'bitbake gstreamer1.0-plugins-imx' and install the resulting ipk from build/tmp/deploy/ipk/<machine>, and you can see what packages are installed by using 'opkg list_installed'. These are more Yocto questions and are likely best resolved by jumping onto a Yocto mailing list or IRC channel.

Note that you likely have the 'gst-fsl-plugin' package installed by default (opkg list_installed | grep fsl-plugin); these are the Freescale GStreamer plugins, which in my opinion are far inferior to the open-source, community-developed gstreamer-imx plugins (which this thread is about).

Tim
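
For reference, the workflow described above boils down to something like this (a sketch; the exact package filenames and the deploy path depend on your machine name and package architecture):

# On the build host: build just the gstreamer-imx recipe
bitbake gstreamer1.0-plugins-imx

# Copy the resulting packages from build/tmp/deploy/ipk/<machine>/ to the target, then install and verify:
opkg install gstreamer1.0-plugins-imx_*.ipk
opkg list_installed | grep gstreamer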


14,740 Views
PaulDeMetrotion
Senior Contributor I

Thanks for the advice. I fixed the missing /dev/video* entries. I had not enabled the v4l2 functionality in the kernel configuration. I now have /dev/video16 and /dev/video17, which were missing. Now to see if this fixes the video issues.

I actually tried to bitbake the gstreamer1.0-plugins-imx recipe, but got the following errors. I would assume it is not included in fsl-image-gui, since I do not get these errors when building that image.

ERROR: Multiple .bb files are due to be built which each provide virtual/libg2d (/home/pauldemet/fsl-arm-yocto-bsp/sources/meta-fsl-arm/recipes-graphics/gpu-viv-g2d/gpu-viv-g2d_3.10.17-1.0.2.bb /home/pauldemet/fsl-arm-yocto-bsp/sources/meta-fsl-bsp-release/imx/meta-fsl-arm/recipes-graphics/imx-gpu-viv/imx-gpu-viv_5.0.11.p4.4-hfp.bb).

This usually means one provides something the other doesn't and should.

ERROR: The recipe gpu-viv-g2d is trying to install files into a shared area when those files already exist. Those files and their manifest location are:


14,740 Views
PaulDeMetrotion
Senior Contributor I

Tim,

Video is now working. I can play an MP4 with video and audio. I do not get audio when I play an AVI, due to a missing AC3 codec.

I still cannot build the gstreamer1.0-plugins-imx recipe, either alone or as part of the fsl-image-gui recipe. Do you have any experience with the error in my previous response?


14,740 Views
PaulDeMetrotion
Senior Contributor I

Do I need to change the PREFERRED_PROVIDER in my configuration file? If so, what should be the new value?
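
For what it's worth, Yocto resolves this kind of "multiple providers" conflict with a PREFERRED_PROVIDER assignment in conf/local.conf. A hedged sketch based on the two recipes named in the error above (which of the two is the right choice depends on the BSP layers in use):

# Pick one of the recipes that provide virtual/libg2d (per the error message above)
PREFERRED_PROVIDER_virtual/libg2d = "imx-gpu-viv"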


14,740 Views
timharvey
Contributor IV

Paul,

I don't seem to encounter the packaging issues you are, but I'm not necessarily building with the same Yocto configuration either. At this point you should head over to one of the Yocto mailing lists or IRC support channels.

Regards,

Tim

On Fri, Apr 1, 2016 at 8:27 AM, PaulDeMetrotion <


14,743 Views
tutran
Contributor I

Dear Tim,

Thanks for your support.

With the new pipeline suggestion, I can now stream from the camera.


14,845 Views
tutran
Contributor I

Dear Tim,

I have tried your pipeline with the imxv4l2src plugin as shown below:

gst-launch-1.0 imxv4l2src device=/dev/video1 ! imxvpuenc_h264 bitrate=10000 ! filesink location=/tmp/file.mp4

Then I viewed the file and saw that the frame size is 640x480.

I want to capture with a bigger frame size, for example 1280x800.

I modified the pipeline like this:

gst-launch-1.0 imxv4l2src device=/dev/video1 ! 'video/x-raw, format=(string)I420, width=(int)1280, height=(int)800, interlaced=(boolean)false, framerate=(fraction)10/1' ! imxvpuenc_h264 bitrate=10000 ! filesink location=/tmp/file.mp4

But I got this error message:

====== IMXV4L2SRC: 4.0.3 build on Mar  4 2016 15:53:15. ======

WARNING: erroneous pipeline: could not link imxv4l2src0 to imxvpuh264enc0

Could you help me with this?


14,845 Views
timharvey
Contributor IV

Tu Tran,

You don't mention what board you have, but I'm going to guess it's a Gateworks GW5400 with digital HDMI video in on /dev/video0 and analog CVBS in on /dev/video1. In this case, if you inspect the capabilities of /dev/video1 via 'v4l2-ctl --device=/dev/video1 --all', you will see that the only video capture size available is 720x480 (NTSC video). (You say 640x480, but I assume you mean 720x480.)

Therefore you need to scale the video at some point in your pipeline, and in your case you would want to do that before it enters the encoder element, because it appears your intention is to create a 1280x800 H264-encoded video. Note that you would likely be way better off encoding the image at its native resolution and up-scaling it while playing it back, as otherwise you're just using up bits on up-scaled video.

You are on the right track by using a capsfilter; however, your pipeline is telling the source plugin to capture at 1280x800, which it quite simply can't do. You need to place something between the source element and the encoder that is capable of image scaling, and in the case of the i.MX6 you would want to use the hardware scaling provided by either the IPU (imxipuvideotransform), the GPU (imxg2dvideotransform), or the PXP (imxpxpvideotransform). Your best bet is likely imxipuvideotransform, as it supports the video format needed by the VPU encoders. Note also that if you specify any parameter that can't be met, you will get the infamous 'could not link' error. It's best to provide only the parameters you absolutely need and let GStreamer figure out the rest; if you run into issues, start removing features from your capsfilter until you get a successful pipeline.

If you remove the 'interlaced=false' and 'framerate=10/1' fields, which the imxv4l2videosrc element doesn't support, you can capture successfully:

gst-launch-1.0 imxv4l2videosrc device=/dev/video1 ! videorate ! imxipuvideotransform ! 'video/x-raw, format=(string)I420, width=(int)1280, height=(int)800' ! imxvpuenc_h264 bitrate=10000 ! filesink location=/tmp/file.mp4

Although imxipuvideotransform can't de-interlace via a capsfilter, 'gst-inspect-1.0 imxipuvideotransform' tells you that it can indeed de-interlace for you by setting the 'deinterlace' property on the element. Thus you can get closer to your original capsfilter via:

gst-launch-1.0 imxv4l2videosrc device=/dev/video1 ! videorate ! imxipuvideotransform deinterlace=true ! 'video/x-raw, format=(string)I420, width=(int)1280, height=(int)800' ! imxvpuenc_h264 bitrate=10000 ! filesink location=/tmp/file.mp4

Because imxv4l2videosrc does not appear to allow framerate adjustments, you can do this via the standard GStreamer 'videorate' element, but do note this may cause some timestamp/playback issues with the h264 encoder:

gst-launch-1.0 imxv4l2videosrc device=/dev/video1 ! imxipuvideotransform ! 'video/x-raw, format=(string)I420, width=(int)1280, height=(int)800' ! videorate ! 'video/x-raw, framerate=10/1' ! imxvpuenc_h264 bitrate=10000 ! filesink location=/tmp/file.mp4

For more information on the gstreamer-imx plugins you can refer to:

- http://trac.gateworks.com/wiki/Yocto/gstreamer/video - examples and descriptions

- https://github.com/Freescale/gstreamer-imx - source code

Regards,

Tim

Tim Harvey - Principal Software Engineer

Gateworks Corporation - http://www.gateworks.com/

3026 S. Higuera St. San Luis Obispo CA 93401

805-781-2000

On Tue, Mar 15, 2016 at 4:05 AM, tutran <admin@community.freescale.com>