I'm using the i.MX6 DualLite Sabre board. I want to simultaneously render two 1080p30 MJPEG video streams from the following webcams: a Logitech C920 and a C930.
One camera is plugged into the USB OTG connector and the other into the miniPCIe connector, so there is no hub in the path that could affect bandwidth.
I'm using the following GStreamer pipeline to render the streams with the hardware-accelerated MJPEG decoder vpudec:
gst-launch-1.0 v4l2src device=/dev/video2 ! image/jpeg, width=1920, height=1080, framerate=30/1 ! queue ! vpudec ! overlaysink overlay-width=1440 overlay-height=1080 zorder=1 sync=false \
v4l2src device=/dev/video3 ! image/jpeg, width=1920, height=1080, framerate=30/1 ! queue ! vpudec ! overlaysink overlay-left=1440 overlay-width=480 overlay-height=1080 zorder=5 sync=false
Rendering one webcam at 1080p30 at a time, there are no problems; the video is fluid. With the above pipeline, which basically creates a big view and a small view of the webcams on screen, I get stuttering video from both channels. The rendering is not fluid: sometimes the video is fluid, sometimes it is not.
Using the fpsdisplaysink element to print statistics on rendered frames, dropped frames, and current fps, I get 30 fps from both cameras when rendering them simultaneously.
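For reference, this is roughly how I collected those statistics on one channel (a sketch only; the overlay properties are dropped here for brevity, and I'm assuming fpsdisplaysink accepts overlaysink via its video-sink property):

```shell
# Wrap the real sink in fpsdisplaysink; with -v, the fps-measurements
# messages (rendered/dropped frames, current and average fps) are
# printed to the console instead of drawn on screen.
gst-launch-1.0 -v v4l2src device=/dev/video2 ! \
    image/jpeg, width=1920, height=1080, framerate=30/1 ! queue ! vpudec ! \
    fpsdisplaysink video-sink=overlaysink text-overlay=false sync=false
```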
I tried setting the CPU to its maximum frequency of 996 MHz. It helped a little, but not enough.
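Assuming the standard cpufreq sysfs interface is available in this BSP (paths may differ per kernel), this is roughly how I pinned the frequency:

```shell
# Force the performance governor so the CPU stays at its maximum
# frequency (996 MHz on this i.MX6 DualLite) instead of scaling down.
for cpu in /sys/devices/system/cpu/cpu[0-9]*; do
    echo performance > "$cpu/cpufreq/scaling_governor"
done

# Verify the current frequency (reported in kHz)
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq
```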
While rendering from both cameras, CPU usage is at about 20%, which is quite impressive.
Is it the right pipeline to use for dual rendering over the HDMI interface?
Do you have some suggestions?
Thanks for your time.