I am trying to take physical buffers from the imxvpudec GStreamer element, modify them on the GPU using GLES and EGL, and then pass them to the VPU encoder via imxvpuenc_h264.
For passing the memory used by the decoder to the GPU, I assume I can use the glTexDirectVIVMap function, as done in gles2_renderer.c in the Freescale/gstreamer-imx repository on GitHub.
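To make the question concrete, here is a minimal sketch of what I have in mind, loosely following gles2_renderer.c. This is an assumption, not working code: the function name `map_decoder_buffer_to_texture` and its parameters are hypothetical, the `GL_VIV_*` defines are guarded in case the platform's gl2ext.h lacks them, and `virt_addr`/`phys_addr` would come from the physical-memory metadata of the GstBuffer received from imxvpudec.

```c
/* Hypothetical sketch: map a decoder buffer into a GL texture using the
 * Vivante GL_VIV_direct_texture extension (glTexDirectVIVMap), similar to
 * gstreamer-imx's gles2_renderer.c. Requires the Vivante GPU driver stack;
 * will not run on a generic desktop GL implementation. */

#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>
#include <EGL/egl.h>

/* Guarded in case this platform's gl2ext.h does not define them. */
#ifndef GL_VIV_I420
#define GL_VIV_I420 0x8FC5
#endif

/* Function-pointer types for the extension entry points, resolved at
 * runtime via eglGetProcAddress in case they are not declared in headers. */
typedef void (*PFN_glTexDirectVIVMap)(GLenum target, GLsizei width,
                                      GLsizei height, GLenum format,
                                      GLvoid **logical,
                                      const GLuint *physical);
typedef void (*PFN_glTexDirectInvalidateVIV)(GLenum target);

/* Hypothetical helper: bind `texture` to the decoder buffer identified by
 * its virtual and physical addresses. Returns 0 on success, -1 if the
 * extension is unavailable. */
static int map_decoder_buffer_to_texture(GLuint texture,
                                         GLsizei width, GLsizei height,
                                         void *virt_addr, GLuint phys_addr)
{
    PFN_glTexDirectVIVMap tex_map =
        (PFN_glTexDirectVIVMap)eglGetProcAddress("glTexDirectVIVMap");
    PFN_glTexDirectInvalidateVIV tex_invalidate =
        (PFN_glTexDirectInvalidateVIV)eglGetProcAddress("glTexDirectInvalidateVIV");

    if (tex_map == NULL || tex_invalidate == NULL)
        return -1; /* GL_VIV_direct_texture not available */

    glBindTexture(GL_TEXTURE_2D, texture);

    /* GL_VIV_I420 is just an example; the format must match what
     * imxvpudec actually outputs (e.g. NV12 on many i.MX pipelines). */
    tex_map(GL_TEXTURE_2D, width, height, GL_VIV_I420,
            &virt_addr, &phys_addr);

    /* Tell the GPU the texture contents changed behind its back. */
    tex_invalidate(GL_TEXTURE_2D);
    return 0;
}
```

The part I am missing is the reverse direction: after rendering with the GPU, how to get a physically contiguous buffer that imxvpuenc_h264 can consume.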
However, I am not sure how to pass the memory to the encoder after having modified it with the GPU. I saw this thread: Texture input to video encoder?. But it is from 2013, so some things may have changed since then.
The accepted solution in that thread was to use the virtual framebuffer from Freescale/linux-module-virtfb (Virtual frame buffer driver for i.MX devices) on GitHub. Is this still the recommended approach? If so, some updates seem to be in order, since the module segfaults on initialization. My error trace is here: Segmentation fault on module initialization · Issue #2 · Freescale/linux-module-virtfb · GitHub
In the same thread, Andre Silva says that glTexDirectVIV cannot be used to obtain a pointer that the CPU can read from, so I cannot pass it to the VPU encoder. Has such functionality perhaps been implemented since then?