I am trying to compose camera capture frames with another frame, but my goal is not to display the result on screen; I want to send the composed frames out over Ethernet. Since I am working with a 1920x1080 video stream, copying camera frames to user space carries a big penalty on my system. My idea is that if I link the capture output to the IPU task's input, and link the overlay frames (from /dev/fb1) to the IPU task's overlay paddr, I can perform a zero-copy composition. Looking over some IPU API examples, it seems that this linking is done by assigning physical addresses (the paddr member of the IPU task struct). I then looked over V4L2 capture examples, but I could not find a way to access the physical addresses of the capture buffers.
How can I achieve this? V4L2 offers three ways to store capture buffers: MMAP, USERPTR, and DMABUF. The DMABUF method seems like the right one, but I could not find a comprehensive example to understand it.
And more importantly, is this the right way to approach this problem? Thanks in advance.