Network Streaming a file using gst-launch (gstreamer)

smwsmart-e
Contributor II

Hi all, I'm quite new to GStreamer, but based on the GStreamer Streaming doc (https://community.freescale.com/docs/DOC-94646) I've created pipelines that stream to/from the board and work well. I then tried to use the same pipeline with a filesrc instead of the videotestsrc, and this is where the problem starts.

For streaming out of the board I'm using:

gst-launch -ve gstrtpbin name=rtpbin filesrc location=/home/linaro/Desktop/foreman_cif.yuv ! vpuenc codec=6 ! queue ! rtph264pay ! rtpbin.send_rtp_sink_0 rtpbin.send_rtp_src_0 ! queue ! udpsink host=192.168.1.29 port=5000 sync=false rtpbin.send_rtcp_src_0 ! udpsink host=192.168.1.29 port=5001 sync=false async=false udpsrc port=5005 ! rtpbin.recv_rtcp_sink_0

This produces nothing on the receiving computer; the terminal output is below:

Setting pipeline to PAUSED ...

[INFO]  Product Info: i.MX6Q/D/S

vpuenc versions: )

        plugin: 3.0.7

        wrapper: 1.0.35(VPUWRAPPER_ARM_LINUX Build on Apr 18 2013 23:02:29)

        vpulib: 5.4.12

        firmware: 2.1.9.36350

Pipeline is live and does not need PREROLL ...

Setting pipeline to PLAYING ...

New clock: GstSystemClock

/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtcp_src_0: caps = application/x-rtcp

/GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:send_rtcp_src: caps = application/x-rtcp

/GstPipeline:pipeline0/GstUDPSink:udpsink1.GstPad:sink: caps = application/x-rtcp

/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtcp_src_0.GstProxyPad:proxypad3: caps = application/x-rtcp

Got EOS from element "pipeline0".

Execution ended after 812999 ns.

Setting pipeline to PAUSED ...

Setting pipeline to READY ...

/GstPipeline:pipeline0/GstUDPSink:udpsink1.GstPad:sink: caps = NULL

/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtcp_src_0: caps = NULL

/GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:send_rtcp_src: caps = NULL

Setting pipeline to NULL ...

Freeing pipeline ...
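A likely culprit for the immediate EOS on the sender side: a raw .yuv file carries no caps, so nothing downstream knows the resolution, framerate, or pixel format, and vpuenc cannot negotiate. On GStreamer 0.10 a videoparse element (from gst-plugins-bad) can supply that information explicitly. A sketch, assuming a 352x288 I420 CIF clip at 30 fps -- the format/width/height/framerate values and the availability of videoparse on the board are assumptions, so adjust to your file:

```shell
# Sketch: describe the raw video explicitly so vpuenc can negotiate.
# The videoparse parameters are assumptions for a CIF I420 clip.
gst-launch -ve gstrtpbin name=rtpbin \
  filesrc location=/home/linaro/Desktop/foreman_cif.yuv \
  ! videoparse format=i420 width=352 height=288 framerate=30/1 \
  ! vpuenc codec=6 ! queue ! rtph264pay ! rtpbin.send_rtp_sink_0 \
  rtpbin.send_rtp_src_0 ! queue ! udpsink host=192.168.1.29 port=5000 sync=false \
  rtpbin.send_rtcp_src_0 ! udpsink host=192.168.1.29 port=5001 sync=false async=false \
  udpsrc port=5005 ! rtpbin.recv_rtcp_sink_0
```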

When streaming from the PC, I'm using:

gst-launch -v gstrtpbin name=rtpbin filesrc location=/home/*/*/big_buck_bunny_1080p_h264_HQ.mov typefind=true ! qtdemux ! rtph264pay ! rtpbin.send_rtp_sink_0 rtpbin.send_rtp_src_0 ! queue ! udpsink host=192.168.1.21 port=5000 rtpbin.send_rtcp_src_0 ! udpsink host=192.168.1.21 port=5001 sync=false async=false udpsrc port=5005 ! rtpbin.recv_rtcp_sink_0

This outputs:

Setting pipeline to PAUSED ...

/GstPipeline:pipeline0/GstFileSrc:filesrc0.GstPad:src: caps = video/quicktime

Pipeline is live and does not need PREROLL ...

/GstPipeline:pipeline0/GstQTDemux:qtdemux0.GstPad:sink: caps = video/quicktime

Setting pipeline to PLAYING ...

New clock: GstSystemClock

/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtcp_src_0: caps = application/x-rtcp

/GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:send_rtcp_src: caps = application/x-rtcp

/GstPipeline:pipeline0/GstUDPSink:udpsink1.GstPad:sink: caps = application/x-rtcp

/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtcp_src_0.GstProxyPad:proxypad3: caps = application/x-rtcp

ERROR: from element /GstPipeline:pipeline0/GstQTDemux:qtdemux0: GStreamer encountered a general stream error.

Additional debug info:

qtdemux.c(3891): gst_qtdemux_loop (): /GstPipeline:pipeline0/GstQTDemux:qtdemux0:

streaming stopped, reason not-linked

Execution ended after 2227471 ns.

Setting pipeline to PAUSED ...

Setting pipeline to READY ...

/GstPipeline:pipeline0/GstUDPSink:udpsink1.GstPad:sink: caps = NULL

/GstPipeline:pipeline0/GstRtpBin:rtpbin.GstGhostPad:send_rtcp_src_0: caps = NULL

/GstPipeline:pipeline0/GstRtpBin:rtpbin/GstRtpSession:rtpsession0.GstPad:send_rtcp_src: caps = NULL

/GstPipeline:pipeline0/GstQTDemux:qtdemux0.GstPad:audio_00: caps = NULL

/GstPipeline:pipeline0/GstQTDemux:qtdemux0.GstPad:video_00: caps = NULL

/GstPipeline:pipeline0/GstQTDemux:qtdemux0.GstPad:sink: caps = NULL

/GstPipeline:pipeline0/GstFileSrc:filesrc0.GstPad:src: caps = NULL

Setting pipeline to NULL ...

Freeing pipeline ...

I've tried different variations of these pipelines, and so far the only successful one is demuxing/decoding the .mov file and then re-encoding it while streaming from the PC to the board. This gives a good picture but a very poor framerate (I presume because the computer is decoding and encoding H.264 simultaneously!). If anyone has any ideas, please let me know!
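For the PC-side pipeline, the "not-linked" error suggests gst-launch never connected qtdemux's dynamically created video pad to rtph264pay. One common workaround on GStreamer 0.10 is to name the demuxer, link its pads explicitly, park the audio on a fakesink, and add an h264parse in the video branch. A sketch carrying over the details from the post above (the config-interval property depends on the gst-plugins-good version installed):

```shell
# Sketch: link qtdemux's sometimes-pads by name so the video branch
# is actually connected; the unused audio pad goes to a fakesink.
gst-launch -v gstrtpbin name=rtpbin \
  filesrc location=/home/*/*/big_buck_bunny_1080p_h264_HQ.mov ! qtdemux name=demux \
  demux.video_00 ! queue ! h264parse ! rtph264pay config-interval=1 ! rtpbin.send_rtp_sink_0 \
  demux.audio_00 ! fakesink \
  rtpbin.send_rtp_src_0 ! queue ! udpsink host=192.168.1.21 port=5000 \
  rtpbin.send_rtcp_src_0 ! udpsink host=192.168.1.21 port=5001 sync=false async=false \
  udpsrc port=5005 ! rtpbin.recv_rtcp_sink_0
```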

1 Solution
timothybean
Contributor IV

Hi,

It looks like the caps are not getting all the way through to the rest of the pipeline.

1. Make sure, for this method, that you start the decoder BEFORE you start the sender.

2. Try adding the config-interval setting on rtph264pay so it sends the SPS/PPS NAL headers every so often.

3. Put an h264parse element after the rtph264depay element on the receiver side.

4. Run your pipeline with more verbosity, e.g. gst-launch --gst-debug=rtpbin:5 ...

That should show you a little more of what is happening.
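As a sketch of points 2 and 3 above, the receiver side might look like this on GStreamer 0.10. The application/x-rtp caps string and the mfw_v4lsink video sink are assumptions -- copy the exact caps printed by the sender's -v output, and use whatever sink works on your board:

```shell
# Receiver sketch. The caps below are an assumption -- paste the
# application/x-rtp caps that the sender prints with -v.
gst-launch -v gstrtpbin name=rtpbin \
  udpsrc port=5000 \
    caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" \
  ! rtpbin.recv_rtp_sink_0 \
  rtpbin. ! rtph264depay ! h264parse ! vpudec ! mfw_v4lsink \
  udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0 \
  rtpbin.send_rtcp_src_0 ! udpsink host=192.168.1.21 port=5005 sync=false async=false
```

On the sender, point 2 corresponds to setting rtph264pay config-interval=1 (or another small interval) so the SPS/PPS headers repeat in-band.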

But if you are not planning to do it this way in the real-world scenario, then I might put my effort into gst-rtsp-server, OR put the video in a container like MPEG2-TS.

Here is the link to the server:

http://people.freedesktop.org/~wtay/

Tim


4 Replies
YixingKong
Senior Contributor IV

Hi

Has your issue been resolved? If yes, we are going to close the discussion in 3 days. If you still need help, please feel free to reply with an update to this discussion.

Thanks,
Yixing

timothybean
Contributor IV

Can you post your full receiving pipeline? Have you set the proper caps on the receiving side?

When you mux/demux the video in the .mov container, the container also carries information about the video that rtph264depay, as well as vpudec, can use. So, from looking at what you have here, I think the caps are not getting set properly. If you post the whole receiver pipeline I could tell.

You could also send the video in a container, such as MPEG2-TS, which would do this for you. I think you may also want to use the config-interval setting of rtph264pay to generate the NAL headers every so often.
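The MPEG2-TS route might be sketched like this. The element names (mpegtsmux, rtpmp2tpay, rtpmp2tdepay, mpegtsdemux, mfw_v4lsink) come from the 0.10 plugin sets, and the caps string varies by GStreamer version -- both are assumptions about what is installed:

```shell
# Sender sketch (PC): mux the H.264 into MPEG-TS so stream info travels in-band.
gst-launch -v filesrc location=/home/*/*/big_buck_bunny_1080p_h264_HQ.mov \
  ! qtdemux ! h264parse ! mpegtsmux ! rtpmp2tpay \
  ! udpsink host=192.168.1.21 port=5000

# Receiver sketch (board): depayload, demux the TS, decode with the VPU.
gst-launch -v udpsrc port=5000 \
    caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=MP2T-ES,payload=33" \
  ! rtpmp2tdepay ! mpegtsdemux ! vpudec ! mfw_v4lsink
```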

The other method could be to use the GStreamer RTSP server. Then the caps would be sent via SDP in the RTSP session. You would then use rtspsrc on the receiving side.

This method would do the same thing you are doing now, just facilitating the caps transfer. The only downside is that it is a stateful connection that needs to be maintained.

Tim

smwsmart-e
Contributor II

Hi Tim,

Thanks for the reply. I've attached two text files showing the pipeline and the terminal output. Since posting I've had some success streaming YUV files out to the PC: a CIF file works well, and a 1080p file works but is very slow (2-3 fps). I'll look into implementing the RTSP server and client and report back soon!
