How to decode H264 with libvpu and display with OpenGL ES 2.0? (using glTexDirectVIV or glTexDirectVIVMap)

3,644 Views
jackos2500
Contributor I

Hi,

My project involves sending a raw stream of H264-encoded video over the network and rendering it with OpenGL ES 2.0 on an i.MX6D based board. I started out with software decoding (using libavcodec) and uploading with glTexImage2D, and then made the software path more efficient by switching the upload to glTexDirectVIV. However, I now want to increase efficiency further by using the hardware decoding facility on the i.MX6 platform. I have read the documentation PDF for libvpu (http://hands.com/~lkcl/eoma/iMX6/VPU_API_RM_L3.0.35_1.1.0.pdf), as well as some code in another project using the library (https://github.com/irtimmer/limelight-embedded/blob/master/jni/nv_imx_dec/nv_imx_dec.c). However, both the documentation and the aforementioned example only explain how to display the decoded frames using V4L, but I need to do it with OpenGL ES 2.0. Can anyone provide me with an example of this? (Preferably using glTexDirectVIV/glTexDirectVIVMap to increase efficiency.)
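In other words, I am hoping the display step can look something like the untested sketch below. It assumes the VPU decoder was opened with chromaInterleave = 1 so frames come out as linear NV12, and the names show_decoded_frame, frame_virt and frame_phys are my own placeholders for the virtual/physical addresses of whichever registered framebuffer vpu_DecGetOutputInfo() reports via indexFrameDisplay:

```c
#define GL_GLEXT_PROTOTYPES
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>   /* GL_VIV_NV12, glTexDirectVIVMap, ... (Vivante driver headers) */

/* Untested sketch: hand one VPU-decoded frame to the GPU without a memcpy.
 * frame_virt would come from IOGetVirtMem() and frame_phys from IOGetPhyMem()
 * on the vpu_mem_desc backing the framebuffer that indexFrameDisplay points
 * at; width/height come from vpu_DecGetInitialInfo(). */
static void show_decoded_frame(GLuint tex, int width, int height,
                               void *frame_virt, unsigned int frame_phys)
{
    GLuint phys = (GLuint)frame_phys;

    glBindTexture(GL_TEXTURE_2D, tex);

    /* Wrap the existing physically contiguous buffer as texture storage. */
    glTexDirectVIVMap(GL_TEXTURE_2D, width, height, GL_VIV_NV12,
                      &frame_virt, &phys);

    /* Mark the texel data as changed; needed after every new frame. */
    glTexDirectInvalidateVIV(GL_TEXTURE_2D);

    /* ...then draw a textured quad as usual; the driver does YUV -> RGB. */
}
```

(My understanding is that after drawing, the buffer would be handed back to the decoder with vpu_DecClrDispFlag() so it can be reused, but again, please correct me if I have this wrong.)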

Also, the documentation for glTexDirectVIV and glTexDirectVIVMap seems to be quite sparse (especially for the latter). Can someone explain to me how exactly they should be used in a scenario like this?
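For reference, this is my reading of the GL_VIV_direct_texture entry points as declared (with GL_GLEXT_PROTOTYPES defined) in the gl2ext.h shipped with the Vivante driver; the annotations are my own interpretation, so please correct me if I have misunderstood:

```c
/* Driver allocates the texel buffer(s) itself and returns the plane
 * pointers in Texels; you memcpy your frame into them, then invalidate.
 * One copy, but no alignment requirements on your source buffer. */
void glTexDirectVIV(GLenum Target, GLsizei Width, GLsizei Height,
                    GLenum Format, GLvoid **Texels);

/* Driver maps an existing buffer (Logical = CPU virtual address,
 * Physical = bus address the GPU can DMA from) as texture storage.
 * Zero copy, but the buffer must be physically contiguous and must
 * stay alive for as long as the texture is in use. */
void glTexDirectVIVMap(GLenum Target, GLsizei Width, GLsizei Height,
                       GLenum Format, GLvoid **Logical,
                       const GLuint *Physical);

/* Called after writing new texel data, before drawing. */
void glTexDirectInvalidateVIV(GLenum Target);

/* Formats include GL_VIV_YV12, GL_VIV_I420, GL_VIV_NV12, GL_VIV_NV21,
 * GL_VIV_UYVY and GL_VIV_YUY2. */
```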

Thanks!

3 Replies

1,740 Views
mghasan
Contributor I

Hi Jack,

Please give more details of your setup including Operating system and your application, from where you want to use HW codecs.


1,740 Views
jackos2500
Contributor I

Hi Mohammed,

I am using Linux (specifically the Arch Linux ARM CuBox-i build on a SolidRun HummingBoard). Video is encoded by the hardware encoder on an Android device (non-Freescale based) and sent over a socket, one frame at a time as soon as it is encoded, to the application running on my board. Frames are then decoded using libavcodec before being uploaded to the GPU with glTexDirectVIV and rendered with standard OpenGL ES 2.0 functions. Essentially, my goal is to send a live video feed from the Android device over a network interface and display it with as little latency as possible. I need to use OpenGL ES for rendering, as components added to my application later will need to draw locally with GPU acceleration. For clarity, I have attached some of my (very early) code.
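Roughly, my current upload step looks like the sketch below (simplified from the attached code; upload_avframe is just an illustrative name, and I am assuming the driver packs the returned planes at a stride equal to the texture width, which holds for my suitably aligned frame sizes):

```c
#define GL_GLEXT_PROTOTYPES
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>
#include <libavcodec/avcodec.h>  /* AVFrame */
#include <string.h>

/* Upload one libavcodec-decoded YUV420P frame via glTexDirectVIV:
 * the driver hands back plane pointers, so this is one memcpy per
 * plane instead of a full glTexImage2D upload and format conversion. */
static void upload_avframe(GLuint tex, const AVFrame *f)
{
    GLvoid *planes[3];  /* filled by the driver: Y, U, V for GL_VIV_I420 */
    int y;

    glBindTexture(GL_TEXTURE_2D, tex);
    glTexDirectVIV(GL_TEXTURE_2D, f->width, f->height, GL_VIV_I420, planes);

    /* Copy row by row because AVFrame linesize may exceed the width. */
    for (y = 0; y < f->height; y++)
        memcpy((char *)planes[0] + y * f->width,
               f->data[0] + y * f->linesize[0], f->width);
    for (y = 0; y < f->height / 2; y++) {
        memcpy((char *)planes[1] + y * (f->width / 2),
               f->data[1] + y * f->linesize[1], f->width / 2);
        memcpy((char *)planes[2] + y * (f->width / 2),
               f->data[2] + y * f->linesize[2], f->width / 2);
    }

    glTexDirectInvalidateVIV(GL_TEXTURE_2D);
}
```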

Thanks.


1,740 Views
CarlosCasillas
NXP Employee

Hi Jack,

Have you ensured that the clip you are trying to decode is completely supported by the H.264 levels of the i.MX6? You could take a look at the following thread:

https://community.freescale.com/thread/338468

Hope this information is useful for you.

Best regards!

/Carlos
