Hi Mohammed,
I am using Linux (specifically the CuBox-i build of Arch Linux ARM on a SolidRun HummingBoard). Video is encoded by the hardware encoder on an Android device (non-Freescale based) and sent over a socket, one frame at a time as soon as it is encoded, to the application running on my board. Frames are then decoded with libavcodec, uploaded to the GPU with glTexDirectVIV, and rendered with standard OpenGL ES 2.0 calls. Essentially, my goal is to stream a live video feed from the Android device over a network interface and display it with as little latency as possible. I need to use OpenGL ES for rendering, as components added to my application later will need to draw locally with GPU acceleration. For clarity, I have attached some of my (very early) code.
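In case it is useful, the decode-and-upload step looks roughly like the simplified sketch below (error handling, the socket reads, and EGL/GLES setup are omitted). It assumes H.264 input, that libavcodec hands back planar I420 (AV_PIX_FMT_YUV420P) frames, and that the GL_VIV_direct_texture entry points (glTexDirectVIV / glTexDirectInvalidateVIV) and the GL_VIV_I420 token are exposed by the Vivante GLES2 headers; if any of that differs on another setup, the details would change.

/* Simplified decode-and-upload loop. Assumes H.264 input and that the
 * decoder produces planar I420 (AV_PIX_FMT_YUV420P) frames; the GL_VIV_*
 * token and glTexDirectVIV prototypes come from the Vivante GLES2 headers. */
#include <stdint.h>
#include <string.h>
#include <libavcodec/avcodec.h>

#define GL_GLEXT_PROTOTYPES
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

static AVCodecContext *open_h264_decoder(void)
{
    const AVCodec *dec = avcodec_find_decoder(AV_CODEC_ID_H264);
    AVCodecContext *ctx = avcodec_alloc_context3(dec);
    avcodec_open2(ctx, dec, NULL);
    return ctx;
}

/* Feed one encoded frame (as received from the socket) to the decoder and
 * upload each decoded picture into the bound texture via glTexDirectVIV. */
static void decode_and_upload(AVCodecContext *ctx, AVPacket *pkt,
                              AVFrame *frm, GLuint tex)
{
    if (avcodec_send_packet(ctx, pkt) < 0)
        return;

    while (avcodec_receive_frame(ctx, frm) == 0) {
        GLvoid *planes[3]; /* Y, U, V destinations owned by the driver */

        glBindTexture(GL_TEXTURE_2D, tex);
        glTexDirectVIV(GL_TEXTURE_2D, frm->width, frm->height,
                       GL_VIV_I420, planes);

        /* Copy row by row: linesize can be padded wider than the frame. */
        for (int y = 0; y < frm->height; y++)
            memcpy((uint8_t *)planes[0] + y * frm->width,
                   frm->data[0] + y * frm->linesize[0], frm->width);
        for (int y = 0; y < frm->height / 2; y++) {
            memcpy((uint8_t *)planes[1] + y * frm->width / 2,
                   frm->data[1] + y * frm->linesize[1], frm->width / 2);
            memcpy((uint8_t *)planes[2] + y * frm->width / 2,
                   frm->data[2] + y * frm->linesize[2], frm->width / 2);
        }

        glTexDirectInvalidateVIV(GL_TEXTURE_2D); /* mark texture dirty */
        /* ...then draw a textured quad with the usual GLES 2.0 calls. */
    }
}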
Thanks.