I am developing a camera with the DART-MX8M-PLUS.
I want to process a captured image while streaming in Qt.
Right now I am doing it like below:
gst-launch-1.0 v4l2src device=/dev/video1 ! imxvideoconvert_g2d ! video/x-raw,width=720,height=480 ! tee name=t t. ! queue ! waylandsink t. ! queue ! jpegenc ! multifilesink location=capture.jpg max-files=1
But the file is being overwritten continuously, and the CPU load also goes up.
Is there another way to capture at the moment I want while streaming?
Thanks.
Best regards,
Jayden
With your command, you need to add num-buffers=1:
gst-launch-1.0 v4l2src num-buffers=1 device=/dev/video2 ! video/x-raw,width=3840,height=2160 ! videoconvert ! jpegenc ! multifilesink location=test1.jpg
That only does a one-shot capture, though; to capture on demand while streaming, you can only do this properly programmatically.
One solution would be to tee into an appsink and encode a JPEG on demand.
A fakesink would also work. See the example attached.
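For reference, a minimal sketch of the tee + appsink idea (this is my own illustration, not the attached example), assuming GStreamer's Python bindings (PyGObject) are installed on the board; the device path, caps, and output filename are taken from Jayden's pipeline, everything else is an assumption:

```python
# Display branch runs continuously; the JPEG branch ends in an appsink that
# only holds the newest frame (drop=true max-buffers=1), so nothing touches
# the disk and no file is overwritten until we explicitly pull a sample.
PIPELINE = (
    "v4l2src device=/dev/video1 ! imxvideoconvert_g2d ! "
    "video/x-raw,width=720,height=480 ! tee name=t "
    "t. ! queue ! waylandsink "
    "t. ! queue ! videoconvert ! jpegenc ! "
    "appsink name=grab drop=true max-buffers=1"
)

def capture_frame(appsink, path, timeout_ns=1_000_000_000):
    """Pull the most recent JPEG sample from the appsink and write it to disk."""
    # "try-pull-sample" is a GstAppSink action signal; timeout is in nanoseconds.
    sample = appsink.emit("try-pull-sample", timeout_ns)
    if sample is None:
        return False
    buf = sample.get_buffer()
    with open(path, "wb") as f:
        f.write(buf.extract_dup(0, buf.get_size()))
    return True

def main():
    import gi  # GStreamer bindings imported here; needs python3-gi on the target
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst

    Gst.init(None)
    pipeline = Gst.parse_launch(PIPELINE)
    pipeline.set_state(Gst.State.PLAYING)
    input("Streaming; press Enter to capture one frame... ")
    if capture_frame(pipeline.get_by_name("grab"), "capture.jpg"):
        print("Saved capture.jpg")
    pipeline.set_state(Gst.State.NULL)

if __name__ == "__main__":
    main()
```

Because the appsink drops all but the latest buffer, the JPEG encoder only does real work when you pull a sample, which also addresses the CPU-load concern.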
Hi malik,
Thank you so much for the good sources.
I'll test it.
Thanks.
Best regards,
Jayden
Do you want to process each frame from the camera? Why not use OpenCV to capture at the moment you want while streaming?
Hi Qmiller,
Then, are there any examples I can refer to for using OpenCV?
Thanks.
Best regards,
Jayden
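Along the lines Qmiller suggested, a minimal OpenCV sketch might look like this; the device index, window name, keybindings, and filename scheme are all assumptions for illustration:

```python
def next_capture_name(index):
    """Numbered filenames so successive captures are not overwritten."""
    return f"capture_{index:03d}.jpg"

def main():
    import cv2  # imported here; needs opencv-python on the target

    cap = cv2.VideoCapture(1)  # index 1 ~ /dev/video1; adjust for your board
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("preview", frame)
        key = cv2.waitKey(1) & 0xFF
        if key == ord("c"):  # press 'c' to save the current frame on demand
            cv2.imwrite(next_capture_name(index), frame)
            index += 1
        elif key == ord("q"):  # press 'q' to quit
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```

Note that this bypasses the imxvideoconvert_g2d hardware path, so JPEG encoding happens on the CPU; for sustained streaming on the i.MX8M Plus the GStreamer appsink approach is likely lighter.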