I think I've run into a limitation of the NXP BSP and would like to confirm that with you. I'm running a Torizon build based on NXP BSP 5.4.70-2.3.3, and the board is powered by an i.MX8 QuadMax. The problem I'm seeing is that I can't read and decode MJPEG frames from my USB camera fast enough to reach the manufacturer-advertised 1920x1080 at 60 FPS.
I first confirmed the camera was not the bottleneck by testing it both on Windows and on my target platform, the latter without JPEG decoding. With a bare GStreamer pipeline I was able to get the advertised 60 FPS, but as soon as I add v4l2jpegdec, my framerate drops to 20 FPS at most. Adding a 'queue' element before it, or switching the capture mode to DMA, only changed things by ~10%.
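For reference, the two test pipelines I used looked roughly like the following. The device path, caps, and measurement sink here are illustrative assumptions, not a copy of my exact command; your camera may enumerate at a different /dev/video node:

```shell
# Baseline: capture MJPEG at 1080p60 with no decoding.
# fpsdisplaysink with a fakesink just measures throughput -- this reaches 60 FPS.
gst-launch-1.0 v4l2src device=/dev/video0 io-mode=dmabuf \
  ! image/jpeg,width=1920,height=1080,framerate=60/1 \
  ! fpsdisplaysink video-sink=fakesink text-overlay=false sync=false -v

# Same capture with v4l2jpegdec added (plus a queue to decouple
# capture from decode) -- this is where the rate drops to ~20 FPS.
gst-launch-1.0 v4l2src device=/dev/video0 io-mode=dmabuf \
  ! image/jpeg,width=1920,height=1080,framerate=60/1 \
  ! queue ! v4l2jpegdec \
  ! fpsdisplaysink video-sink=fakesink text-overlay=false sync=false -v
```

The reported framerate is printed by fpsdisplaysink when run with -v.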
When checking the BSP release notes, I noticed that MJPEG HW decoding does not appear to be supported for my processor (please see below). That would explain why, when I tried the proper element, v4l2video1jpegdec, all I got was a still frame.
Can someone from the NXP team please confirm this for me? And if it is true, are there any plans to add support, or any possible workarounds (even if that means changing the processor or camera)?