
Texture input to video encoder?

Question asked by kalle on Jul 15, 2013
Latest reply on Jan 13, 2014 by kalle

Is it possible to use buffer from GPU memory on i.MX6 as an input to a hardware video encoder? I would prefer h.264 encoder, but any other will do, too.

 

My use case is the following. I get a frame from a USB camera, but the image uses a Bayer mosaic for its color information, which looks like image no. 2 at https://en.wikipedia.org/wiki/File:Colorful_spring_garden_Bayer.png . I need to convert it to a format the video encoder accepts (most probably YUV420) before I can feed it in. This conversion is easy to do with OpenGL ES shaders, but I then end up with an OpenGL texture that has to be handed over to the video encoder. I could download the texture pixels to the CPU and upload them back to the GPU for encoding, but that drops the framerate significantly. Instead, I would like to tell the video encoder to use the existing texture in GPU memory.

 

Can I use the pointer returned by eglQueryImageFSL as input to the video encoder, or will the image data still pass through the CPU?

 

Is UseEGLImage() implemented in the OpenMAX components of the video encoders? This is impossible to tell just by looking at the headers, and I have not bought an i.MX6 board yet because I do not know whether this crucial feature is supported.
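For reference, the portable way to probe this at runtime (rather than from the headers) would be to call the standard OpenMAX IL macro and check the return code. This is only a sketch: `hEncoder`, `kInputPortIndex`, and `eglImage` are placeholder names for an already-loaded encoder component handle, its input port, and an EGLImageKHR created from the texture — and whether Freescale's components actually implement this instead of returning an error is exactly my open question:

```c
/* Sketch, not vendor-verified: probe EGLImage support on a loaded
 * OMX IL encoder component. */
OMX_BUFFERHEADERTYPE *hdr = NULL;
OMX_ERRORTYPE err = OMX_UseEGLImage(hEncoder,
                                    &hdr,
                                    kInputPortIndex, /* encoder input port */
                                    NULL,            /* pAppPrivate */
                                    eglImage);       /* EGLImageKHR wrapping the texture */
if (err == OMX_ErrorNotImplemented) {
    /* Component cannot consume an EGLImage directly; fall back
     * to a CPU copy path. */
}
```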

 

In case passing memory directly from one GPU task to another is not possible, could someone tell me how long it takes to download one full-HD RGB image from the GPU, and how much CPU time that consumes?
