Hello,
I have an iMX6 transmitting live video to an Ubuntu PC as the receiver. My pipelines are as follows:
Tx:
rtpbin name=rtpbin \
imxv4l2videosrc device=/dev/video1 fps-n=30 capture-mode=4 ! \
imxvpuenc_h264 bitrate=1000 ! \
h264parse ! \
rtph264pay ! rtpbin.send_rtp_sink_0 \
rtpbin.send_rtp_src_0 ! udpsink host=192.168.1.11 port=5001 \
rtpbin.send_rtcp_src_0 ! udpsink host=192.168.1.11 port=5002 sync=false async=false \
udpsrc port=5005 ! rtpbin.recv_rtcp_sink_0
Rx:
rtpbin name=rtpbin \
udpsrc name=source port=5001 ! $CAPS ! rtpbin.recv_rtp_sink_0 \
rtpbin. ! \
rtph264depay ! \
h264parse ! \
avdec_h264 ! \
videorate drop-only=true ! \
intervideosink sync=false \
udpsrc port=5002 ! rtpbin.recv_rtcp_sink_0 \
rtpbin.send_rtcp_src_0 ! udpsink port=5005 sync=false async=false
I added a probe on the `udpsrc` src pad on the Rx side to get notified via `GST_PAD_PROBE_TYPE_PUSH`. My understanding is that the probe callback is called for every buffer push.
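For reference, here is a minimal sketch of how I attach the probe, assuming a PyGObject setup; the `fakesink` pipeline is a stand-in for the full Rx pipeline above, the counter is illustrative, and I pass `Gst.PadProbeType.BUFFER` to get one callback per pushed buffer:

```python
#!/usr/bin/env python3
import time
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

# Stand-in for the Rx pipeline above; fakesink keeps the sketch self-contained.
pipeline = Gst.parse_launch("udpsrc name=source port=5001 ! fakesink")

count = 0

def on_buffer(pad, info):
    global count
    count += 1
    # time.perf_counter() records the local arrival time of this buffer.
    print(f"buffer {count} at {time.perf_counter():.6f}")
    return Gst.PadProbeReturn.OK

# BUFFER fires the callback once for every buffer pushed downstream.
srcpad = pipeline.get_by_name("source").get_static_pad("src")
srcpad.add_probe(Gst.PadProbeType.BUFFER, on_buffer)

pipeline.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()
```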
Now, given a camera operating at 30 FPS and transmitting over the network, I expected to receive approximately 30 buffers per second at the `udpsrc` on the receiver side.
1. Is this understanding correct?
2. Does one pushed buffer correspond to one transmitted video frame?
With my current debugging, I see a seemingly random number of buffers being pushed per second; for example, over a period of 4 seconds the per-second counts were 112, 130, 142, and 107. I used the Python timer API `time.perf_counter()` to timestamp each buffer in the `GST_PAD_PROBE_TYPE_PUSH` callback.
My main goal is to obtain the timestamp of each frame received at the `udpsrc` on the receiver side. How can this be achieved?
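What I have in mind, as a sketch continuing the snippet above: attach a second probe downstream of the decoder and read each buffer's PTS there. I assume the decoder is given a name in the launch string (e.g. `avdec_h264 name=dec`) so it can be looked up, and that the buffer PTS is the timestamp I need:

```python
# Continuing the sketch above; assumes "... ! avdec_h264 name=dec ! ..." in
# the launch string so the decoder can be looked up by name.
def on_frame(pad, info):
    buf = info.get_buffer()
    # buf.pts is the presentation timestamp in nanoseconds; it is a stream
    # timestamp, not wall-clock time. time.perf_counter() gives arrival time.
    if buf.pts != Gst.CLOCK_TIME_NONE:
        print(f"frame pts={buf.pts / Gst.SECOND:.6f}s "
              f"arrived={time.perf_counter():.6f}")
    return Gst.PadProbeReturn.OK

# Downstream of the decoder, one buffer corresponds to one decoded frame,
# unlike at udpsrc, where one buffer is one received UDP datagram.
dec = pipeline.get_by_name("dec")
dec.get_static_pad("src").add_probe(Gst.PadProbeType.BUFFER, on_frame)
```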
Regards.
What problem are you having? I suggest you refer to the "RTP/UDP MPEGTS streaming" section of the Linux User's Guide, which tells you how to set the different parameters, such as `streaming-latency` and `low_latency_tolerance`. You can also set `do-timestamp=false`.
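For example, `do-timestamp` is a property `udpsrc` inherits from GstBaseSrc, so it could be set from Python like this (a sketch; `source` is the name given to the `udpsrc` in the Rx pipeline above):

```python
# do-timestamp is inherited from GstBaseSrc; when False, udpsrc does not
# stamp incoming buffers with the pipeline clock at arrival time.
pipeline.get_by_name("source").set_property("do-timestamp", False)
```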