i.MX8 Video Streaming Pipeline

1,887 Views
alexstefan
Contributor III

Hi,

I'm a relative beginner to embedded Linux, trying to get a fairly simple (or so I thought) video processing pipeline running on an i.MX8 QuadMax. Right now, my processing pipeline looks something like this:

  1. OpenCV captures video from the USB video camera using v4l2
  2. A simple text overlay is added on the video
  3. The resulting image is piped as a series of images to stdout and then into ffmpeg
  4. Ffmpeg encodes it and streams it to YouTube

The command being:

./apalis-opencv-test | ffmpeg -thread_queue_size 1024 -i pipe: -f lavfi -i anullsrc -f flv -c:v libx264 -c:a aac -preset ultrafast rtmp://a.rtmp.youtube.com/live2

 

My main issue is that I only get ~18 FPS, which seems really slow for a chip that can do hardware encoding. I'm hoping someone can point me in the right direction on at least what to test next.
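One thing worth testing before changing frameworks is whether ffmpeg itself can reach the hardware encoder. Below is a hedged sketch of the same command with ffmpeg's V4L2 memory-to-memory encoder (h264_v4l2m2m) swapped in for libx264; whether the i.MX8 BSP actually exposes this encoder to ffmpeg is an assumption to verify first:

```shell
# Check whether ffmpeg was built with the V4L2 M2M encoder:
#   ffmpeg -encoders | grep v4l2
# Sketch only: h264_v4l2m2m may also require an explicit pixel format
# (e.g. -pix_fmt nv12) depending on what the driver accepts.
./apalis-opencv-test | ffmpeg -thread_queue_size 1024 -i pipe: \
  -f lavfi -i anullsrc \
  -c:v h264_v4l2m2m -b:v 4M \
  -c:a aac \
  -f flv rtmp://a.rtmp.youtube.com/live2
```

If the encoder is not exposed, the GStreamer route suggested below remains the more reliable path on this platform.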

Thanks!

7 Replies

1,823 Views
alexstefan
Contributor III

Following advice received on the GStreamer mailing list, I focused on the caps negotiation between the different elements. It looks like the JPEG coming from the camera is in a different format than the JPEG decoder expects.


1,843 Views
malik_cisse
Senior Contributor I

Also, I do not recommend GStreamer debug level 4; it is too verbose. Level 3 will usually do.

1,872 Views
malik_cisse
Senior Contributor I

Hi,

ffmpeg's libx264 is a software encoder, hence the poor performance.

You should instead use a GStreamer pipeline with hardware H.264 encoding (the v4l2h264enc element).

GStreamer has all the functionality you are interested in and is the native multimedia framework on the i.MX8 QuadMax.

You can run

gst-inspect-1.0 | grep 264

to check the exact element name.

1,857 Views
alexstefan
Contributor III

Thank you! I have set up GStreamer and run a basic pipeline to confirm it can encode to H.264 and stream to YouTube. What I haven't been able to do, though, is confirm the FPS of this pipeline, to make sure it's indeed achieving the required performance.

gst-launch-1.0 -v videotestsrc pattern=ball is-live=true ! video/x-raw,width=640,height=480,framerate=30/1 ! v4l2h264enc ! h264parse ! flvmux name=mux audiotestsrc ! queue ! audioconvert ! avenc_aac ! aacparse ! mux. mux. ! tee name=t t. ! rtmpsink location="rtmp://a.rtmp.youtube.com/live2/"

So that would be problem #1.

Problem #2 is taking the video from my USB camera, which outputs MJPEG, and streaming that to YouTube. I'm getting stuck passing the frames to the v4l2h264enc element. I've attached the output from running the pipeline:

GST_DEBUG=4 gst-launch-1.0 -v v4l2src device=/dev/video2 ! v4l2jpegdec ! v4l2h264enc ! v4l2h264dec ! autovideosink


1,844 Views
malik_cisse
Senior Contributor I

Hi Alex,

To #1: GStreamer has an fpsdisplaysink element that can display FPS info:
https://gstreamer.freedesktop.org/documentation/debugutilsbad/fpsdisplaysink.html?gi-language=c
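A minimal sketch of measuring FPS this way (videotestsrc stands in for the real source; with -v, the measured rate is printed to the console rather than drawn on screen):

```shell
# fpsdisplaysink wraps a real video sink and reports the measured frame rate.
# text-overlay=false keeps the overlay off; fakesink discards the frames,
# so only the upstream elements' throughput is measured.
gst-launch-1.0 -v videotestsrc is-live=true \
  ! video/x-raw,width=640,height=480,framerate=30/1 \
  ! fpsdisplaysink text-overlay=false video-sink=fakesink
```

To measure the encoder itself, insert the encode/decode elements before fpsdisplaysink in place of the raw caps passthrough.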

 

To #2: you usually need a videoconvert element between the decoder and the encoder. Also, you may need an MJPEG decoder rather than a plain JPEG decoder.
Here is an example I had working on a Jetson Nano platform. Note, however, that avdec_mjpeg and x264enc are both software codecs and not suitable in your case; this only gives a general idea of the pipeline elements:

# x264enc jpeg from Logitech C920 USB cam
gst-launch-1.0 rtpbin name=rtpbin ! v4l2src device=/dev/video1 ! image/jpeg,width=1920,height=1080,framerate=30/1 ! avdec_mjpeg ! videoconvert ! x264enc name=video tune=zerolatency ! rtph264pay ! gscreamtx media-src=0 ! rtpbin.send_rtp_sink_0 rtpbin.send_rtp_src_0 ! udpsink host=$RECEIVER_IP port=5000 bind-port=5000 rtpbin.send_rtcp_src_0 ! udpsink host=$RECEIVER_IP bind-port=5001 port=5001 udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0




1,793 Views
alexstefan
Contributor III

I'm pretty sure all my issues come from the fact that there is actually no BSP support for MJPEG hardware decoding on the i.MX8 QuadMax. I've detailed my research process in this post:

 

https://community.nxp.com/t5/i-MX-Processors/i-MX8-QuadMax-MJPEG-VPU-Decoding/td-p/1487087


1,840 Views
alexstefan
Contributor III

Thank you for the reply! I only have a v4l2videojpegdec, but that only produces a still frame. I have tried a videoconvert element between the decoder and the encoder; same problem. The only element that produced some sort of a running pipeline was a decodebin, but that only produced green artifacts.

Will try a variation of your pipeline and report back.
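A sketch of the variation, with the software jpegdec standing in for the missing hardware MJPEG decoder, so that only the H.264 encode stays in hardware (the device path and caps are assumptions carried over from the earlier attempts; verify element names with gst-inspect-1.0):

```shell
# Software JPEG decode (jpegdec) feeding the hardware H.264 encoder.
# fakesink is used so the pipeline can be validated without a stream key.
gst-launch-1.0 -v v4l2src device=/dev/video2 \
  ! image/jpeg,framerate=30/1 \
  ! jpegdec \
  ! videoconvert \
  ! v4l2h264enc \
  ! h264parse \
  ! fakesink
```

If this runs at full rate, the fakesink can be replaced with the flvmux/rtmpsink tail from the earlier working test pipeline.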