Hello all,
I use the latest Freescale Linux release (4.1.15_1.2.0_ga) on a custom i.MX6Q board, modeled after the SabreSD one, equipped with a 1280x1024 parallel 8-bit Bayer sensor connected to a CSI port. I can acquire images from that sensor at up to 60 fps. I would like to convert the stream coming from that sensor into an H.264 stream. I know H.264 encoding is implemented in hardware in the VPU, and I can use it through the 'imxvpuenc_h264' GStreamer plugin from the Freescale/gstreamer-imx repository on GitHub (GStreamer 1.0 plugins for i.MX platforms). However, I am still struggling with the Bayer to I420 conversion needed to feed imxvpuenc_h264.
I have read mentions in different threads here of GPU 3D shaders, of the code from "Efficient, High-Quality Bayer Demosaic Filtering on GPUs", of virtual framebuffers and of promising performance, but I have not found a clear description of the GStreamer pipeline (or of any method not involving GStreamer) used for that conceptually simple and widespread task. Of course I can build a GStreamer pipeline out of standard plugins, but its performance is very poor. Can someone enlighten me? Thanks in advance
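For reference, the demosaic operation itself is simple to state; here is a minimal pure-Python sketch of bilinear interpolation on an RGGB mosaic (function names are illustrative only, not taken from any of the plugins mentioned above — and of course a CPU loop like this is nowhere near fast enough for 1280x1024 at 60 fps, which is exactly why a GPU or IPU path is needed):

```python
def colour_at(y, x):
    """Bayer colour index at (y, x) for an RGGB mosaic:
    0 = red, 1 = green, 2 = blue."""
    if y % 2 == 0 and x % 2 == 0:
        return 0  # red on even rows / even columns
    if y % 2 == 1 and x % 2 == 1:
        return 2  # blue on odd rows / odd columns
    return 1      # green elsewhere

def demosaic_bilinear(raw):
    """Bilinear demosaic of an RGGB mosaic given as a list of rows.
    Each missing colour is the mean of the same-colour samples in the
    surrounding 3x3 window; returns rows of (r, g, b) tuples."""
    h, w = len(raw), len(raw[0])

    def interp(y, x, c):
        if colour_at(y, x) == c:
            return raw[y][x]  # the sensor measured this colour here
        vals = [raw[yy][xx]
                for yy in range(max(0, y - 1), min(h, y + 2))
                for xx in range(max(0, x - 1), min(w, x + 2))
                if colour_at(yy, xx) == c]
        return sum(vals) / len(vals)

    return [[(interp(y, x, 0), interp(y, x, 1), interp(y, x, 2))
             for x in range(w)] for y in range(h)]
```

The GPU shader in the paper cited above computes essentially this per-pixel neighbourhood interpolation (with better edge handling); the sketch only pins down what each output pixel is made of.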
We have no such solution. Sorry for the inconvenience.
We can suggest demosaicing with OpenCL.
Please check this thread:
https://community.nxp.com/message/643484#comment-643484
Have a great day,
Victor
Well, we have managed to write a gst plugin using OpenGL shaders with the code from "Efficient, High-Quality Bayer Demosaic Filtering on GPUs", but we are now struggling to efficiently extract the RGB image from the virtual framebuffer without using glReadPixels, in order to hand it to imxipuvideotransform, which produces the I420 input for imxvpuenc_h264. Some forum members seem to have succeeded in doing that. At the moment we have a very slow GStreamer pipeline:
gst-launch-1.0 imxv4l2videosrc ! our-bayer2rgbGL ! imxipuvideotransform ! video/x-raw,format=I420 ! imxvpuenc_h264 ! fakesink
It is slow because our-bayer2rgbGL does not use the physical address of the input buffer as its GL input, and above all because it does not pass the physical address of the virtual framebuffer to imxipuvideotransform, so every frame goes through a CPU copy. Any hint would be welcome.
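As an aside, the colour-space step that imxipuvideotransform performs for us is easy to pin down. Here is a minimal pure-Python sketch of RGB-to-I420 packing, just to document the buffer layout imxvpuenc_h264 expects; I am assuming full-range BT.601/JFIF coefficients here, while the IPU may well use limited-range BT.601, so treat the constants as illustrative:

```python
def _clamp(v):
    """Clamp a rounded sample to the 0..255 byte range."""
    return max(0, min(255, v))

def rgb_to_i420(pixels, width, height):
    """Pack an RGB image (rows of (r, g, b) tuples, even width/height)
    into an I420 buffer: a full-resolution Y plane followed by
    2x2-subsampled U and V planes (full-range BT.601 coefficients)."""
    y_plane, u_plane, v_plane = bytearray(), bytearray(), bytearray()
    for row in pixels:
        for r, g, b in row:
            y_plane.append(_clamp(round(0.299 * r + 0.587 * g + 0.114 * b)))
    for y in range(0, height, 2):
        for x in range(0, width, 2):
            # Average each 2x2 block before computing one chroma sample.
            r = sum(pixels[y + dy][x + dx][0] for dy in (0, 1) for dx in (0, 1)) / 4
            g = sum(pixels[y + dy][x + dx][1] for dy in (0, 1) for dx in (0, 1)) / 4
            b = sum(pixels[y + dy][x + dx][2] for dy in (0, 1) for dx in (0, 1)) / 4
            u_plane.append(_clamp(round(128 - 0.168736 * r - 0.331264 * g + 0.5 * b)))
            v_plane.append(_clamp(round(128 + 0.5 * r - 0.418688 * g - 0.081312 * b)))
    return bytes(y_plane + u_plane + v_plane)
```

The resulting buffer is width*height*3/2 bytes, which matches the I420 caps negotiated in the pipeline above; the real conversion of course has to stay on the IPU to reach 60 fps.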