I am using a GStreamer pipeline on an i.MX8M Mini dual-core platform (OS: Yocto L5.4.24-2.1.0) to capture a 5MP@20FPS camera input and apply overlays to it with the command below:
gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=100 ! video/x-raw,width=2592,height=1944 ! queue ! imxvideoconvert_g2d ! video/x-raw,width=1920,height=1080 ! textoverlay text="Device name[%STATUS] reading : %%%" valignment=top halignment=left shaded-background=true ! textoverlay text="Device %Serial Number : ..%Tag location.." valignment=bottom halignment=left shaded-background=true ! clockoverlay halignment=right valignment=top shaded-background=true time-format="%H:%M:%S %D" ! vpuenc_h264 ! h264parse ! mp4mux ! filesink location=/home/root/video.mp4
Here is the expected output on the camera stream after applying the overlay.
When I use the above command, CPU utilisation rises to about 120% (checked with the top command) and the video is recorded at only 6 FPS. Debugging shows that the GStreamer overlay elements (textoverlay/clockoverlay, which render in software on the CPU) are causing the high CPU utilisation and the resulting FPS drop.
Based on this issue, I have the following queries:
1. Going through the i.MX8MM datasheet and the GStreamer documentation, I saw that the 8MM supports overlay through the GPU. If I can offload the software overlay to the GPU, I believe I can achieve the full 20 FPS.
Could someone suggest how I can use the GPU overlay feature to perform the overlay on the input camera stream?
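For reference, this is the kind of GPU-composited pipeline I have in mind (a sketch only, not verified on my board): the text is pre-rendered to a PNG and blended onto the camera stream with imxcompositor_g2d instead of the software textoverlay elements. I am assuming the NXP imxcompositor_g2d element and its sink-pad properties (xpos, ypos, width, height, alpha) are available in this BSP, and /home/root/overlay.png is a hypothetical pre-rendered overlay image.

```shell
# Sketch: blend a pre-rendered PNG overlay with the G2D (GPU) compositor
# instead of the CPU-based textoverlay elements.
gst-launch-1.0 -v \
  imxcompositor_g2d name=comp \
    sink_0::xpos=0 sink_0::ypos=0 sink_0::width=1920 sink_0::height=1080 \
    sink_1::xpos=0 sink_1::ypos=0 sink_1::alpha=0.8 \
  ! vpuenc_h264 ! h264parse ! mp4mux ! filesink location=/home/root/video.mp4 \
  v4l2src device=/dev/video0 num-buffers=100 \
  ! video/x-raw,width=2592,height=1944 ! queue ! comp.sink_0 \
  filesrc location=/home/root/overlay.png ! pngdec ! imagefreeze ! comp.sink_1
```

Is this the intended way to use the GPU for overlay, and is there a recommended way to handle dynamic text (e.g. the changing status/clock fields) with this approach?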
2. How do I confirm whether both cores are being utilised? Does i.MX8MM L5.4.24-2.1.0 support any load-balancing feature? If yes, how does load balancing between the two cores work on the i.MX8MM in L5.4.24-2.1.0?
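For context, this is how I am currently checking per-core load (assuming the standard procps top and, optionally, the sysstat package in the Yocto image):

```shell
# Inside top, press '1' to toggle the per-core view, which shows how the
# ~120% total utilisation is split across the two cores.
# top

# Without top, the raw per-core counters can be read directly:
grep '^cpu[0-9]' /proc/stat

# If sysstat is in the image, mpstat gives per-core percentages directly:
# mpstat -P ALL 2 5
```

Is this a reliable way to verify that the kernel is spreading the pipeline threads across both cores?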
3. Are there any other modules or methods available (preferably a HW module) to create overlays on the camera stream on the i.MX8MM?