Hi all,
I recently got VPU transcoding to work on my i.MX8X-based board after some device tree changes. See the original thread here: https://community.nxp.com/t5/i-MX-Processors/i-MX8X-transcode-m-jpeg-video-to-h264-with-gstreamer-us...
I'm using a custom C0 i.MX8X board with a custom Yocto Linux based on NXP's v5.4.70_2.3.0 BSP. The gstreamer pipeline I'm currently using for the transcoding operation is the following:
gst-launch-1.0 -ve filesrc location=OBC_mjpeg.avi ! avidemux ! v4l2jpegdec ! imxvideoconvert_g2d ! v4l2convert ! v4l2h264enc ! h264parse ! mp4mux ! filesink location=OBC_h264.mp4
This works, but it takes about three times the video duration. For example, my source video is 8 seconds long, and the transcoding process takes about 24 seconds.
Is it possible to achieve transcoding in real time (or practically real time)? If so, what gstreamer pipeline would allow me to reach such speeds?
Thanks in advance,
Gabriel
Hi @joanxie,
I want to convert the camera's JPEG stream to H.264.
I have tried the following, but it does not work.
Would you please let me know how I can do it?
I tried the following command using libgstvideo4linux2.7z on a C0 board with the 5.4.70-2.3.0 BSP, but I get an error.
gst-launch-1.0 -v v4l2src device=/dev/video3 ! 'image/jpeg,width=1280,height=960,framerate=(fraction)30/1' ! v4l2video1jpegdec ! queue ! v4l2h264enc ! queue ! h264parse ! avimux ! filesink location=OBC_h264.avi
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
../../../../git/libs/gst/base/gstbasesrc.c(3072): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
I also tried the following command. I got a file, but I was not able to play it back as a video.
gst-launch-1.0 -v v4l2src device=/dev/video3 ! 'image/jpeg,width=1280,height=960,framerate=(fraction)30/1' ! v4l2video1jpegdec ! queue ! v4l2h264enc ! queue ! h264parse ! avimux ! filesink location=OBC_h264.avi
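Comparing with the transcoding pipeline in the first post, the main difference I see is that it puts imxvideoconvert_g2d ! v4l2convert between the decoder and the encoder, so the not-negotiated error may just be a raw-format mismatch. A variant I plan to try next (untested on my side, simply mirroring that element order) is:
gst-launch-1.0 -v v4l2src device=/dev/video3 ! 'image/jpeg,width=1280,height=960,framerate=(fraction)30/1' ! v4l2video1jpegdec ! queue ! imxvideoconvert_g2d ! v4l2convert ! queue ! v4l2h264enc ! queue ! h264parse ! avimux ! filesink location=OBC_h264.avi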
The R&D team will check the root cause; if there is any update, I will let you know.
Hello,
The issue was replicated with the following pipeline for AVI (reducing the framerate reduces the transcoding time):
gst-launch-1.0 filesrc location=sample_1920x1080.avi ! decodebin ! queue ! videorate ! "video/x-raw,framerate=20/1" ! imxvideoconvert_g2d ! queue ! v4l2convert ! queue ! v4l2h264enc ! queue ! h264parse ! mp4mux ! filesink location=test_gl.mp4
Transcoding time does not appear to be tied to real time. The workaround seems to be reducing either the frame rate or the frame size so that the transcoding time matches real time.
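For the frame-size route, something along these lines could be tried (the 1280x720 target is only an example; imxvideoconvert_g2d should do the scaling when a smaller size is requested downstream):
gst-launch-1.0 filesrc location=sample_1920x1080.avi ! decodebin ! queue ! imxvideoconvert_g2d ! "video/x-raw,width=1280,height=720" ! queue ! v4l2convert ! queue ! v4l2h264enc ! queue ! h264parse ! mp4mux ! filesink location=test_720p.mp4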
Have you tried something like this with success?
gst-transcoder-1.0 sample_1920x1080.avi test_gt.mp4 -s 1920x1080
I have tried replicating both the original pipeline and the suggested one, changing the container from MP4 to Matroska.
I see no difference in the output. Perhaps this is related to how the JPEG decoding from the AVI is handled?
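In concrete terms, the Matroska variant only swaps the muxer and the output file name, roughly like this (filenames are placeholders):
gst-launch-1.0 filesrc location=sample_1920x1080.avi ! decodebin ! queue ! imxvideoconvert_g2d ! queue ! v4l2convert ! queue ! v4l2h264enc ! queue ! h264parse ! matroskamux ! filesink location=test_gl.mkv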
Is it possible to share the original AVI you are testing with? (I also want to check with the same resolution.)
Did you post the same question as in the original thread? I couldn't follow what you mean.
Hello Joan,
I do not understand what you are saying.
It seems that there are performance issues with the VPU when transcoding from M-JPEG to H.264.
I have tried transcoding from two other formats and there does not seem to be an issue, which is why my guess is that the problem is specific to the original format.
I will also continue to look into the issues that were posted just a few hours ago.