Hi. I'm using the i.MX 8M Mini to record video and capture images from a camera. The video must support 1920x1080 at 60 fps. I have run into some problems.
1. If I use a USB camera, I can get a 1920x1080 60 fps video stream, but the format is MJPEG. How can I use GStreamer with hardware acceleration (VPU) to transcode MJPEG to H.264 and save it to a file?
I use the following command, but the jpegdec plugin seems to be too slow. Is jpegdec hardware accelerated?
gst-launch-1.0 -e v4l2src device=/dev/video1 ! "image/jpeg,framerate=30/1,width=1920,height=1080" ! jpegparse ! jpegdec ! video/x-raw ! queue ! vpuenc_h264 ! h264parse ! qtmux ! filesink location=1080p.mp4
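One way to check whether jpegdec is hardware accelerated is to inspect the plugin that provides it; these are generic GStreamer tools, nothing i.MX-specific:

# list every JPEG-related element installed on the board
gst-inspect-1.0 | grep -i jpeg
# show details of jpegdec; the "Plugin Details" section names the providing plugin
gst-inspect-1.0 jpegdec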
2. If I use a MIPI camera, I can only get a 1920x1080 30 fps video stream in YUYV format, so I can use the following command:
gst-launch-1.0 -e v4l2src device=/dev/video0 io-mode=4 ! video/x-raw, format=YUY2, \
width=1920,height=1080, framerate=30/1 ! tee name=t ! queue ! vpuenc_h264 ! queue ! \
h264parse ! qtmux ! filesink location=test.mp4 t. ! queue ! waylandsink
So does the i.MX 8M Mini MIPI CSI interface support a 1920x1080 60 fps YUYV camera stream? If so, is the above command fast enough to record a 1080p 60 fps stream to a file?
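As a sanity check, the formats and frame rates the capture driver actually exposes can be listed with v4l2-ctl; this assumes v4l-utils is installed on the image and that /dev/video0 is the MIPI CSI capture node:

# list pixel formats, frame sizes and frame intervals advertised by the driver
v4l2-ctl -d /dev/video0 --list-formats-ext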
Or does the i.MX 8M Mini only support a 1920x1080 60 fps camera stream as MJPEG over USB? If so, how do I transcode MJPEG to H.264 and save it to a file with hardware acceleration (VPU)?
Thank you!
Hi ZHANG
for transcoding one can look at sect. 7.3.4 Transcoding, i.MX Linux User’s Guide.
i.MX 8M Mini does not support MJPEG in hardware; the VPU capabilities are described in
sect. 13.1.1.3 Video Processing Unit (VPU), i.MX 8M Mini Applications Processor Reference Manual.
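As a rough sketch of the pipeline shape that section describes (file-to-file, decode on the VPU, re-encode to H.264; the input file, container and the vpudec/vpuenc_h264 elements from the i.MX GStreamer plugins are only an example, and the i.MX 8M Mini VPU can decode H.265 but not MJPEG):

# decode with the VPU, re-encode to H.264, mux into MP4
gst-launch-1.0 -e filesrc location=input.mkv ! matroskademux ! h265parse ! vpudec ! queue ! \
vpuenc_h264 ! h264parse ! qtmux ! filesink location=output.mp4

for an MJPEG input the decode step would have to fall back to the software jpegdec element.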
Best regards
igor
Hi, igor.
Thanks for your kind reply. If I understand correctly, the i.MX 8M Mini cannot transcode a 1080p 60 fps MJPEG camera stream to H.264 in real time, since there is no hardware for MJPEG decoding. But it should be able to encode a 1080p 60 fps YUYV camera stream to H.264 with the VPU. The GStreamer command should look like the following:
gst-launch-1.0 -e v4l2src device=/dev/video0 io-mode=4 ! video/x-raw, format=YUY2, \
width=1920,height=1080, framerate=60/1 ! tee name=t ! queue ! vpuenc_h264 ! queue ! \
h264parse ! qtmux ! filesink location=test.mp4 t. ! queue ! waylandsink
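One way to check whether the pipeline really sustains 60 fps would be to replace waylandsink in the display branch with fpsdisplaysink (a stock GStreamer element, not i.MX-specific) and run with -v so the measured rate is printed to the console:

gst-launch-1.0 -v -e v4l2src device=/dev/video0 io-mode=4 ! video/x-raw, format=YUY2, \
width=1920,height=1080, framerate=60/1 ! tee name=t ! queue ! vpuenc_h264 ! queue ! \
h264parse ! qtmux ! filesink location=test.mp4 t. ! queue ! fpsdisplaysink text-overlay=false video-sink=waylandsink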
>But it should be able to encode a 1080p 60 fps YUYV camera stream to H.264 with the VPU.
for such a case one can look at sect. 7.3.3 Video encoding, i.MX Linux User’s Guide
Best regards
igor
Hi, @igorpadykov. I want to use the VPU Wrapper to encode the stream from the camera directly. Is there any example of that? I have only found the API documentation and calling sequence in the i.MX VPU Application Programming Interface Linux® Reference Manual.
Thank you.
in general one can look at vpu unit tests:
https://source.codeaurora.org/external/imx/imx-test/tree/test?h=lf-5.10.52_2.1.0
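The link above points at the test/ directory in that tree; to browse the sources locally one can clone that branch (adjust the tag to match the BSP release in use):

git clone -b lf-5.10.52_2.1.0 https://source.codeaurora.org/external/imx/imx-test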
Best regards
igor