Hi,
I am experiencing slow decoding and low streaming quality when trying to stream video out from an i.MX6 quad core. I am wondering if I set up the pipeline wrongly.
The following are the pipelines I used.
(receiver on desktop connected via ethernet to i.mx6)
1.gst-launch udpsrc caps=" application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,payload=(int)96,ssrc=(uint)2674837201,clock-base=(uint)2959668548,seqnum-base=(uint)14300" port=5000 ! rtph264depay ! decodebin2 ! d3dvideosink
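For comparison, I also considered a receiver variant with a jitter buffer in front of the depayloader and synchronisation disabled on the sink, in case the dropped buffers are a timestamping issue rather than a decode-speed issue. This is only a sketch; I have not verified that gstrtpjitterbuffer is available in this 0.10 build:

```shell
gst-launch udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,payload=(int)96" port=5000 \
  ! gstrtpjitterbuffer latency=100 \
  ! rtph264depay ! decodebin2 ! d3dvideosink sync=false
```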
(sender from i.mx6)
//sending using test pattern
1.) gst-launch videotestsrc ! x264enc ! rtph264pay ! udpsink host=192.168.0.100 port=5000 -v
//sending using filesrc with a sample video file
2.) gst-launch filesrc location=Trailer.mp4 ! decodebin2 ! x264enc ! rtph264pay ! udpsink host=192.168.0.100 port=5000
3.) gst-launch filesrc location=Trailer.mp4 ! decodebin2 ! vpuenc codec=6 ! queue ! rtph264pay ! udpsink host=192.168.0.100 port=5000
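For reference, a lower-latency software-encode variant of pipeline 1 that I considered looks like the following. This is a sketch; it assumes the tune and speed-preset properties are available in this build of x264enc:

```shell
# tune=zerolatency disables lookahead/B-frames; speed-preset trades quality for speed
gst-launch videotestsrc ! x264enc tune=zerolatency speed-preset=ultrafast bitrate=1000 \
  ! rtph264pay ! udpsink host=192.168.0.100 port=5000 -v
```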
I have attached the logs from the receiver end:
(1.)log_videotestsrc.txt
(2.)log_filesrc.txt
(3.)log_filesrc_vpuenc.txt
The following is one of the common messages I see at the receiver end:
WARNING: from element /GstPipeline:pipeline0/GstD3DVideoSink:d3dvideosink0: A lot of buffers are being dropped.
Additional debug info:
gstbasesink.c(2873): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstD3DVideoSink:d3dvideosink0:
There may be a timestamping problem, or this computer is too slow.
At the receiver end:
1.) the test pattern is not stable throughout; it seems to be dropping packets
2.) the sample video takes a long time to start decoding and does not complete (it stops after a few seconds of distorted display)
3.) the sample video decodes, but with distortion and delay
I am wondering what would be a better pipeline to use to reduce the latency.
I recently also started using libmfw_gst_tvsrc.so to grab the TV-in source. I would like to find out how the raw video can be scaled, as I did not see any caps options for tvsrc. I tried
gst-launch tvsrc ! video/x-raw-yuv, width=720, height=480 ! mfw_ipucsc ! video/x-raw-yuv, width=360, height=240 ! vpuenc codec=6 ! queue ! rtph264pay ! udpsink host=192.168.0.100 port=5000
but it did not work.
Thanks and Regards,
Eric
Original Attachment has been moved to: log_videotestsrc.txt.zip
Original Attachment has been moved to: log_filesrc.txt.zip
Original Attachment has been moved to: log_filesrc_vpuenc.txt.zip
Try tvsrc with 720x576.
Hi Robbie,
The camera I am using is NTSC, not PAL, but thanks for the suggestion anyway. :smileyhappy:
Hi swl,
My experience is that tvsrc only works at a resolution equal to the standard NTSC/PAL size.
You can check the standard NTSC/PAL sizes in mxc_v4l2_capture.c.
Can you give me a sample of how to use the tvsrc GStreamer element to stream video over the network?
Please refer to my post at:
How did you get the mfw_deinterlacer element? I don't have it.
And if I use the following command:
gst-launch -v tvsrc device=/dev/video0 ! 'video/x-raw-yuv,format=(fourcc)UYVY,width=720,height=576,framerate=15/1' ! mfw_ipucsc ! 'video/x-raw-yuv,format=(fourcc)I420, width=320,height=240,framerate=15/1'! vpuenc codec=avc gopsize=2 ! video/x-h264,width=320,height=240 ! rtph264pay ! udpsink host=192.168.1.108 port=5000
I get the following error:
WARNING: erroneous pipeline: could not link mfwgsttvsrc0 to mfwgstipucsc0
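A "could not link" error at parse time is usually a caps negotiation failure. I wonder whether dropping the fixed framerate from the caps would help, since the driver may not support 15/1 at 720x576 (this is an assumption on my part, not something I have verified):

```shell
# same pipeline, but letting the source and driver negotiate the framerate
gst-launch -v tvsrc device=/dev/video0 \
  ! 'video/x-raw-yuv,format=(fourcc)UYVY,width=720,height=576' \
  ! mfw_ipucsc \
  ! 'video/x-raw-yuv,format=(fourcc)I420,width=320,height=240' \
  ! vpuenc codec=avc gopsize=2 \
  ! rtph264pay ! udpsink host=192.168.1.108 port=5000
```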
Hi Robbie,
Thanks again for the hint, but I was wondering whether it is possible to scale the raw video from tvsrc so as to reduce the resolution before encoding.
Regards,
Hi swl,
The only place where you can do the scaling is in the IPU, so in order to scale before encoding you need to start a pipeline something like this:
gst-launch v4l2src ! video/x-raw-yuv,width=320,height=240 ! mfw_isink disp-width=240 disp-height=120 axis-left=1200 axis-top=500 rotation=3
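Applied to the streaming case, the same idea (capture at the standard NTSC size, scale in the IPU, then encode) might look like this. Whether mfw_ipucsc performs the scaling as well as the colorspace conversion is an assumption on my part:

```shell
# capture at the native NTSC size, downscale in the IPU, then VPU-encode and stream
gst-launch tvsrc device=/dev/video0 \
  ! 'video/x-raw-yuv,width=720,height=480' \
  ! mfw_ipucsc \
  ! 'video/x-raw-yuv,width=360,height=240' \
  ! vpuenc codec=6 ! queue ! rtph264pay \
  ! udpsink host=192.168.0.100 port=5000
```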