IMX8 Video overlay with graphics before H264 encode


TerryBarnaby1
Contributor IV

Hi, we are designing a visual inspection system using a 1920x1080p30 video camera.

We need to overlay graphics onto the video stream prior to H264 encoding, as well as onto the display, on a low-power system. I am not finding any information on how to do this with an SoC like the iMX8M-Plus. The iMX6 had an IPU that could compose/overlay graphics onto the video stream, although at a maximum resolution of 1024x1024 (it could do this four times over for 1920x1080). The iMX8M family seems to have lost this aspect of video processing.

So I guess the 2D or 3D GPU would have to do this composition/overlay to memory while also performing normal display work. It appears there are no GStreamer plugins to handle this, so I would have to create something. However, at this stage I want to know:

1. Would the iMX8M-Plus hardware be capable of this with minimal CPU usage? (The camera would be YUV I422/I420, and the composer/overlay output needs to be fed to the VPU encoder (NV12?) as well as to the display.)

2. Any ideas on how to go about this? We would have to use the GPU while the normal X11/Wayland graphics GUI is in use, and feed the data through a zero-copy DMA buffer pipeline, etc.

Terry

16 Replies
TerryBarnaby1
Contributor IV

Many thanks for the info. I would like to try this to prove that its output can be taken through the H264 VPU encoder, and to measure the overall CPU usage of this. However, where and how would I get the GStreamer imxcompositor_g2d plugin? I have been trying to build a suitable community- or NXP-based Yocto image for the iMX6 and iMX8 test platforms I have, but have never seen this plugin. The NXP GStreamer plugins seem to be in quite a mess at the moment.
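The kind of synthetic test I have in mind, once I have a build with the plugin, is something like the following (element names are taken from this thread and the NXP docs, so they may need checking with gst-inspect-1.0 on the actual target):

# Composite two test patterns with the G2D compositor and push the result
# through the VPU H264 encoder, discarding the output; CPU usage can then
# be watched with top while it runs.
gst-launch-1.0 imxcompositor_g2d name=c sink_1::alpha=0.5 ! vpuenc_h264 ! fakesink \
  videotestsrc is-live=true ! video/x-raw,width=1920,height=1080,framerate=30/1 ! c.sink_0 \
  videotestsrc is-live=true pattern=snow ! video/x-raw,width=320,height=240,framerate=30/1 ! c.sink_1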

Should the user-level part of this be in the gstreamer1.0-plugins-imx package?

If I build an i.MX community Yocto release using "https://github.com/Freescale/fsl-community-bsp-platform -b dunfell", it cannot build a gstreamer1.0-plugins-imx package.

If I build an i.MX NXP Yocto release using "https://source.codeaurora.org/external/imx/imx-manifest -b imx-linux-zeus -m imx-5.4.70-2.3.0.xml" for an imx8mqevk machine, there is no imxcompositor_g2d plugin as far as I can see in the build (I cannot run it as I don't have an imx8mqevk), and if I add "gstreamer1.0-plugins-imx" to the build (IMAGE_INSTALL_append) I get the error "imx8mqevk (not in COMPATIBLE_MACHINE)".

So what should I build to get a suitable Yocto Linux system that provides a working imxcompositor_g2d GStreamer plugin?

Zhiming_Liu
NXP TechSupport

Hi

 

The i.MX8MQ doesn't have a 2D GPU, so you cannot use imxcompositor_g2d; you can try another imxcompositor element.
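To check which compositor elements a given image actually provides, you can run this on the target (assuming gst-inspect-1.0 is installed):

# List the compositor elements provided by the installed GStreamer plugins
gst-inspect-1.0 | grep -i compositor

# Show the details (pads, properties) of a specific element
gst-inspect-1.0 imxcompositor_g2d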

 

BR

TerryBarnaby1
Contributor IV

Thanks for the information. I had just worked out that the iMX8MQ didn't support some of this video processing, but I didn't realise it didn't have a 2D GPU! This iMX8 range is very messy and has strange part naming!

However, the basic question remains: we are not targeting the iMX8MQ for our design, we will probably be using an iMX8M-Plus or iMX8-DualXPlus. What I am trying to determine is how to build a suitable Yocto release with support for hardware video compositing before H264 encoding in general.

The software support has changed a lot over the last 5 years, and there are many routes to producing a Yocto release and many graphics and video processing driver changes in and outside the kernel. There doesn't appear to be a simple overall guide on these changes, where the direction is going, and which user-level and kernel-level APIs are needed. So I'm trying to understand this so we can first test our video processing concept, then design a board and produce a suitable Yocto build for that custom board.

For example, am I right in thinking that the imx-gst1.0-plugin package has replaced the gstreamer1.0-plugins-imx package, and if so, since when, and which lower-level APIs/drivers does it need/support (DRI?)? I would like to build systems for the iMX6 and iMX8M-Plus that are compatible, but I'm not sure this is possible.

Terry

Zhiming_Liu
NXP TechSupport

Hi

 

All BSP release notes and the details about the i.MX GStreamer 1.0 plugins are in i.MX_Linux_Release_Notes.pdf (Multimedia section).

i.MX_Linux_Users_Guide.pdf describes how to run the tests.

About the last point you mentioned, the release notes say:

i.MX 6 family:
• overlaysink: G2D-based video sink plugin
• imxv4l2sink: V4L2-based video sink plugin
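For a quick check on an i.MX6 board, a minimal test of the G2D-based sink could look like this (assuming the BSP image includes the i.MX plugins):

# Display a test pattern through the G2D-based overlay sink on i.MX6
gst-launch-1.0 videotestsrc ! video/x-raw,width=1920,height=1080,framerate=30/1 ! overlaysink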

 

TerryBarnaby1
Contributor IV

Thanks for the info, but that is information specific to a particular NXP BSP release.

I need to produce a Yocto release for a custom board and thus need to configure a Yocto build that gives me the most current and up-to-date video processing GStreamer elements. The NXP Yocto builds are different from the community builds (which are closer to mainline), and there is the BSP type, the three different kernel variants, and then all the paraphernalia of different kernel driver configurations. Also, older Yocto builds have different GStreamer elements and drivers.

I am just trying to understand what the current/planned underlying video processing software tree is, so I can follow it in our builds. It has been stated that the community Yocto BSP should be used for custom boards, but this is quite different from the NXP Yocto with respect to video processing. For example, at the user level there are the gstreamer1.0-plugins-imx and imx-gst1.0-plugin packages, neither of which builds with a community-based Yocto dunfell build for an iMX6DL.

Am I right in thinking that the imx-gst1.0-plugin package has taken over from gstreamer1.0-plugins-imx for all i.MX variants, and does this rely on the DRI GPU interface route?

I just want to understand the overall hardware video processing architecture (hardware, software layers and packages) and its history, so I can develop in the most sensible way.

Terry

Zhiming_Liu
NXP TechSupport

Hi

 

Yes, imx-gst1.0-plugin has replaced gstreamer1.0-plugins-imx.

In the i.MX6 family, video rendering relies on G2D, not DRI.

In the i.MX8 family, video rendering relies on DRI, because Wayland uses DRI.
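If you use the NXP Yocto BSP (imx-manifest), the plugins come from the imx-gst1.0-plugin recipe. Adding it to an image and checking the result can look roughly like this; the recipe and image names should be verified against the release notes of the BSP release you use:

# In the Yocto build directory: add the NXP GStreamer plugins to the image
# (pre-kirkstone "_append" syntax, as used by zeus/dunfell/hardknott builds)
echo 'IMAGE_INSTALL_append = " imx-gst1.0-plugin"' >> conf/local.conf
bitbake imx-image-multimedia

# On the target: confirm the NXP elements are present
gst-inspect-1.0 | grep -e imxcompositor -e vpuenc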

TerryBarnaby1
Contributor IV

Many thanks.

It would be nice to have an overview of the video processing architecture (hardware and software layers, APIs and packages) and its history in the "i.MX Linux® User's Guide", and of how this is supported in the NXP Yocto BSP and the community Yocto BSP.

Zhiming_Liu
NXP TechSupport

Hi

 

1. YUV422 and YUV420 input and output formats are supported; output to the encoder (NV12) is also supported.

2. See section 7.3.16 "Video composition" in i.MX_Linux_User's_Guide.pdf: imxcompositor_g2d uses the corresponding hardware to accelerate video composition. It can be used to composite multiple videos into one. The video position, size, and rotation can be specified during composition. Video color space conversion is also performed automatically if the input and output videos are not in the same format. Each video can be given an alpha and z-order value to control alpha blending and the video blending sequence.
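As an illustration, a two-input composition with position, size, alpha and z-order set on the second input could look like this (the pad property names follow the User's Guide example; please verify them with gst-inspect-1.0 imxcompositor_g2d on your board):

# Composite a full-screen test pattern with a smaller, half-transparent overlay
gst-launch-1.0 imxcompositor_g2d name=c \
    sink_1::xpos=64 sink_1::ypos=64 sink_1::width=640 sink_1::height=360 \
    sink_1::alpha=0.5 sink_1::zorder=2 ! waylandsink \
  videotestsrc ! video/x-raw,width=1920,height=1080,framerate=30/1 ! c.sink_0 \
  videotestsrc pattern=ball ! video/x-raw,width=640,height=360,framerate=30/1 ! c.sink_1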

NagendraB
Contributor II

Hello 

I am working with similar requirements, and I would like to understand how to test video overlays on the i.MX8M Plus platform. Unfortunately the Linux User's Guide dated L5.4.70_2.3.0, 13 January 2021 does not have i.MX8M Plus specific info, and the overlaysink plugin is not available.

TerryBarnaby1
Contributor IV

With the current (NXP Yocto hardknott based) Linux I have found that you can use the 2D graphics engine to perform the overlay processing. The GStreamer imxcompositor_g2d element can do this. A typical GStreamer command looks like this:

gst-launch-1.0 imxcompositor_g2d name=c sink_1::alpha=0.5 ! waylandsink \
  v4l2src device=/dev/video3 ! video/x-raw,width=1920,height=1080,framerate=30/1 ! c.sink_0 \
  multifilesrc location=./test1.png caps=image/png,framerate=1/1 ! pngdec ! imagefreeze ! c.sink_1

This will take a full-HD camera stream and overlay the test1.png file on it. Following the imxcompositor_g2d you can pass the video stream through "vpuenc_h264 ! h264parse ! avimux ! filesink location=/tmp/temp.avi" to encode using the hardware H264 encoder, possibly teeing it to the display as well as to the H264 encoder if needed.
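For reference, the teed variant I have in mind looks something like this (untested sketch, with the same /dev/video3 and test1.png assumptions as above; depending on what gets negotiated, a video/x-raw,format=NV12 caps filter or a video convert element may be needed in front of vpuenc_h264):

# Display the composited stream and encode it to a file at the same time
# (-e sends EOS on Ctrl-C so the AVI file gets finalised properly)
gst-launch-1.0 -e imxcompositor_g2d name=c sink_1::alpha=0.5 ! tee name=t \
  t. ! queue ! waylandsink \
  t. ! queue ! vpuenc_h264 ! h264parse ! avimux ! filesink location=/tmp/temp.avi \
  v4l2src device=/dev/video3 ! video/x-raw,width=1920,height=1080,framerate=30/1 ! c.sink_0 \
  multifilesrc location=./test1.png caps=image/png,framerate=1/1 ! pngdec ! imagefreeze ! c.sink_1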

NagendraB
Contributor II

Hello Terry

Thanks for the inputs. If I want to blend the video layer with the Wayland GUI layer, is that possible with a similar approach?

 



TerryBarnaby1
Contributor IV

I guess that depends on exactly what you want to do. You would need to understand the iMX8MP hardware, the implemented GStreamer modules, the Wayland server code, your GUI toolkit, etc., info on which is lacking, and I'm no expert. It also depends on what you mean by the Wayland GUI layer.

In my case I needed to overlay some text on the video stream prior to displaying it in a Qt window, as well as H264-encoding it to a file, and I am using the Qt GUI platform. In my code the graphical "layer" (which comes from test1.png in my example) that is overlaid on the video is generated in a Qt QImage using normal Qt drawing primitives, and the GStreamer partial pipeline "appsrc name=\"appsrc\" ! videoconvert ! imagefreeze ! c.sink_1" is used to feed this into the imxcompositor_g2d. The screen output is "waylandsink name=\"videoSink\"", and there is some C++ GUI code to move and size this waylandsink area over a particular Qt window in my application.
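Put together, the overlay branch looks roughly like this. The appsrc caps shown are only an illustration and have to match the QImage format actually pushed from the Qt side (BGRA is what QImage::Format_ARGB32 ends up as in memory on a little-endian target, but check for your setup); the string goes to gst_parse_launch() rather than gst-launch-1.0, because appsrc needs buffers pushed from application code:

# Overlay branch of the pipeline string built inside the Qt application
appsrc name=appsrc caps=video/x-raw,format=BGRA,width=1920,height=1080,framerate=1/1 \
  ! videoconvert ! imagefreeze ! c.sink_1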

NagendraB
Contributor II

Hello Terry,
In my case, consider a use case like a simple Wayland application with a video window inside it, with a configured position and resolution. What would be the best approach? Should I use multiple EGL surfaces, or Wayland sub-surfaces? Could you suggest which approach would be better?

NagendraB
Contributor II

Hello 

Consider a simple Wayland application which runs multiple videos in it.

Another point: I am trying to use the alpha property of waylandsink (i.MX8M Plus), which is not working. Is there any limitation here?

NagendraB
Contributor II

Hello NXP,

The alpha property of waylandsink is not working (5.4.70, i.MX8M Plus). Is there any limitation?

NagendraB
Contributor II

Hello,
Any inputs on my above query? Does waylandsink have a limitation with respect to the alpha property?
Also, any idea how to use the chromakey property using the GPU/VPU?
