Best way to encode OpenGL render to H264 video stream

colindefais
Contributor III

Hello,

We need to encode the output of an OpenGL/OpenGL-ES application running on an i.MX6 Quad to an H.264 video stream. We would like the lowest possible latency at a resolution of 1280x480 and 30 fps.

I have found two threads that seem to cover this topic:

https://community.freescale.com/thread/309677#342056

https://community.freescale.com/thread/303338#312459

Those threads discuss rendering the OpenGL/OpenGL-ES application into a virtual framebuffer (using the driver provided in the SDK). But we don't really understand how to read back from this virtual framebuffer for encoding, nor how to keep the OpenGL/OpenGL-ES application synchronized with the 30 fps video encoding rate.

We would like to use the GStreamer framework if possible.

Which APIs should we use? Are there any examples we could examine?


Bio_TICFSL
NXP TechSupport

Hi Colin,

The virtual framebuffer is no longer supported on the latest GPU drivers. The main reason the virtual FB was developed was that previous Vivante driver implementations did not support zero-copy via EGL_KHR_image (https://www.khronos.org/registry/egl/extensions/KHR/EGL_KHR_image.txt), which is supported in the latest drivers (version 4.6 and up).

So I would recommend using the standard pbuffer or EGLImage methods instead.
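As an illustration (this is not BSP-specific code, and the attribute values such as ES2/RGB888/1280x480 are only examples), the pbuffer route means creating an off-screen EGL surface and making it current, roughly:

```c
#include <stddef.h>
#include <EGL/egl.h>

/* Sketch only: create an off-screen pbuffer surface and make it current.
 * Error checking is omitted for brevity; adjust the attributes to match
 * the colour format your encoder expects. */
static EGLSurface init_offscreen(EGLDisplay *out_dpy, EGLContext *out_ctx)
{
    EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    eglInitialize(dpy, NULL, NULL);

    static const EGLint cfg_attr[] = {
        EGL_SURFACE_TYPE,    EGL_PBUFFER_BIT,
        EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
        EGL_RED_SIZE, 8, EGL_GREEN_SIZE, 8, EGL_BLUE_SIZE, 8,
        EGL_NONE
    };
    EGLConfig cfg;
    EGLint n;
    eglChooseConfig(dpy, cfg_attr, &cfg, 1, &n);

    static const EGLint pbuf_attr[] = {
        EGL_WIDTH, 1280, EGL_HEIGHT, 480, EGL_NONE
    };
    EGLSurface surf = eglCreatePbufferSurface(dpy, cfg, pbuf_attr);

    static const EGLint ctx_attr[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
    EGLContext ctx = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, ctx_attr);
    eglMakeCurrent(dpy, surf, surf, ctx); /* GL rendering now targets the pbuffer */

    *out_dpy = dpy;
    *out_ctx = ctx;
    return surf;
}
```

From there the rendered frames can be read back (or shared via EGLImage on drivers that support it) and handed to the encoder.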

You can find more information about GPU memory allocators and system virtual pools in the i.MX6 Graphics User's Guide, under your BSP documentation folder.

Regards

coryb
Contributor III

I find myself in an almost identical situation to Colin's, except I am using the i.MX8. I'm trying to figure out how to encode the output of an OpenGL ES application as an H.264 video stream using GStreamer. I've found how to create a pbuffer with EGL, and I have been following tutorials on building GStreamer pipelines, but I cannot find a good explanation, example, or documentation on integrating the pbuffer into the GStreamer pipeline. Do you or anyone else know of any examples, or have an idea of where to look?
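Not an official example, but the simplest pattern I have seen is to read the pbuffer back with glReadPixels and push each frame into an appsrc at the head of the encoding pipeline. A rough sketch of that idea (the encoder element name `imxvpuenc_h264` from the third-party gstreamer-imx plugins is an assumption here; substitute whatever H.264 encoder element your BSP provides):

```c
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>
#include <GLES2/gl2.h>

#define W 1280
#define H 480

/* Build a pipeline fed by appsrc. The encoder element name is an
 * assumption; replace it with the H.264 encoder from your BSP. */
static GstElement *build_pipeline(GstElement **appsrc_out)
{
    GstElement *pipeline = gst_parse_launch(
        "appsrc name=src format=time is-live=true "
        "caps=video/x-raw,format=RGBA,width=1280,height=480,framerate=30/1 "
        "! videoconvert ! imxvpuenc_h264 ! h264parse "
        "! rtph264pay ! udpsink host=127.0.0.1 port=5000", NULL);
    *appsrc_out = gst_bin_get_by_name(GST_BIN(pipeline), "src");
    return pipeline;
}

/* Call once per rendered frame, after the GL work for it has finished. */
static void push_frame(GstElement *appsrc, GstClockTime pts)
{
    GstBuffer *buf = gst_buffer_new_allocate(NULL, W * H * 4, NULL);
    GstMapInfo map;
    gst_buffer_map(buf, &map, GST_MAP_WRITE);
    /* Copy the pbuffer contents out of the GPU. This costs one copy;
     * EGLImage would avoid it, but this is the simplest working path. */
    glReadPixels(0, 0, W, H, GL_RGBA, GL_UNSIGNED_BYTE, map.data);
    gst_buffer_unmap(buf, &map);
    GST_BUFFER_PTS(buf) = pts;
    GST_BUFFER_DURATION(buf) = gst_util_uint64_scale(1, GST_SECOND, 30);
    gst_app_src_push_buffer(GST_APP_SRC(appsrc), buf); /* takes ownership */
}
```

I would still like to know whether there is a zero-copy way to do this on the i.MX parts, since the glReadPixels round trip adds latency.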

colindefais
Contributor III

Thanks for the reply,

I have been looking at pbuffer and EGLImage documentation on the web, and I came across the OpenGL Framebuffer Object (FBO).

Is it a better solution for my use case? If so, do you have samples for the i.MX6Q?

I also wonder how to avoid a copy between the OpenGL buffer and the video encoder.
