Video decoding/rendering pipeline startup delay

727 Views
brainmeltdown
Contributor I

I'm using the i.MX GStreamer imxvpudec plugin in the context of a video-over-IP phone.  The pipeline it uses to decode and render H.264 video received from the network is written in C, but essentially it is

      appsrc ! h264parse ! imxvpudec ! imxipuvideotransform ! imxg2dvideosink.  

The code is running on a Boundary Devices BD-SL-I.MX6, which I gather was originally called Sabre Lite and was the original i.MX6 reference design board.  The video display device is an HDMI monitor at /dev/video1.  The input is from appsrc because packets are coming in via a non-GStreamer RTP stack.
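
For reference, the setup is roughly along the lines of the sketch below (simplified and with illustrative names; the appsrc caps and the helper functions are not the exact production code):

#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

static GstElement *pipeline;
static GstElement *appsrc;

/* Build the pipeline described above.  The element names "src", "parse"
 * and "sink" are added here only so the elements can be looked up later. */
static void build_pipeline(void)
{
    gst_init(NULL, NULL);

    pipeline = gst_parse_launch(
        "appsrc name=src is-live=true format=time "
        "caps=video/x-h264,stream-format=byte-stream,alignment=au "
        "! h264parse name=parse ! imxvpudec "
        "! imxipuvideotransform ! imxg2dvideosink name=sink",
        NULL);

    appsrc = gst_bin_get_by_name(GST_BIN(pipeline), "src");
    gst_element_set_state(pipeline, GST_STATE_PLAYING);
}

/* Called by the (non-GStreamer) RTP stack with a depacketized H.264
 * access unit.  appsrc takes ownership of the pushed buffer. */
static void on_h264_access_unit(const guint8 *data, gsize len, GstClockTime pts)
{
    GstBuffer *buf = gst_buffer_new_allocate(NULL, len, NULL);

    gst_buffer_fill(buf, 0, data, len);
    GST_BUFFER_PTS(buf) = pts;
    gst_app_src_push_buffer(GST_APP_SRC(appsrc), buf);
}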

I log a message when the code sets the pipeline state to PLAYING and I log pipeline bus events.  From setting the pipeline to PLAYING until the state actually changes from PAUSED to PLAYING takes only about 1 second, which is acceptable.  But decoded video is not rendered on the HDMI display until nearly 4 seconds after packets start arriving.
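
To narrow down where the time goes, I could add instrumentation along these lines (a sketch only; it assumes the element names from the snippet above and a running GLib main loop):

#include <gst/gst.h>

/* Log the wall-clock time at which the first decoded frame reaches the
 * video sink, then remove the probe. */
static GstPadProbeReturn first_frame_probe(GstPad *pad, GstPadProbeInfo *info,
                                           gpointer user_data)
{
    g_print("first buffer at video sink: %" GST_TIME_FORMAT "\n",
            GST_TIME_ARGS(gst_util_get_timestamp()));
    return GST_PAD_PROBE_REMOVE;
}

/* Log top-level pipeline state changes with timestamps. */
static gboolean bus_cb(GstBus *bus, GstMessage *msg, gpointer user_data)
{
    GstElement *pipeline = user_data;

    if (GST_MESSAGE_TYPE(msg) == GST_MESSAGE_STATE_CHANGED &&
        GST_MESSAGE_SRC(msg) == GST_OBJECT(pipeline)) {
        GstState old_state, new_state;

        gst_message_parse_state_changed(msg, &old_state, &new_state, NULL);
        g_print("pipeline %s -> %s at %" GST_TIME_FORMAT "\n",
                gst_element_state_get_name(old_state),
                gst_element_state_get_name(new_state),
                GST_TIME_ARGS(gst_util_get_timestamp()));
    }
    return TRUE;
}

static void add_instrumentation(GstElement *pipeline)
{
    GstBus *bus = gst_element_get_bus(pipeline);
    gst_bus_add_watch(bus, bus_cb, pipeline);
    gst_object_unref(bus);

    GstElement *sink = gst_bin_get_by_name(GST_BIN(pipeline), "sink");
    GstPad *pad = gst_element_get_static_pad(sink, "sink");

    gst_pad_add_probe(pad, GST_PAD_PROBE_TYPE_BUFFER, first_frame_probe,
                      NULL, NULL);
    gst_object_unref(pad);
    gst_object_unref(sink);
}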

Where might such a long delay most likely arise?  Once the video starts rendering, the latency is not too bad; I would estimate it at less than 500 ms.

1 Reply

571 Views
art
NXP Employee

The most likely cause of the delay you observe is that decoding of the stream can start only on an I-frame (intra-coded frame), and I-frames may appear relatively rarely in the stream (e.g. once every few seconds).  How often I-frames appear is determined by the video source side and cannot be improved on the receiving side in any way.
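
If you want to confirm this on the receiving side, you could log when the first key frame actually reaches the decoder, for example with a pad probe on the h264parse source pad (a sketch only; it assumes the parser element is given the name "parse" in your pipeline description).  h264parse marks non-key frames with GST_BUFFER_FLAG_DELTA_UNIT, so the first buffer without that flag is the first I-frame:

#include <gst/gst.h>

/* Report the first key frame coming out of h264parse, then remove the probe. */
static GstPadProbeReturn keyframe_probe(GstPad *pad, GstPadProbeInfo *info,
                                        gpointer user_data)
{
    GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER(info);

    if (!GST_BUFFER_FLAG_IS_SET(buf, GST_BUFFER_FLAG_DELTA_UNIT)) {
        g_print("first key frame at %" GST_TIME_FORMAT "\n",
                GST_TIME_ARGS(gst_util_get_timestamp()));
        return GST_PAD_PROBE_REMOVE;
    }
    return GST_PAD_PROBE_OK;
}

static void watch_for_first_keyframe(GstElement *pipeline)
{
    GstElement *parse = gst_bin_get_by_name(GST_BIN(pipeline), "parse");
    GstPad *pad = gst_element_get_static_pad(parse, "src");

    gst_pad_add_probe(pad, GST_PAD_PROBE_TYPE_BUFFER, keyframe_probe,
                      NULL, NULL);
    gst_object_unref(pad);
    gst_object_unref(parse);
}

If the 4-second gap ends as soon as that message appears, the delay is simply the key-frame interval of the sender, and the only real fix is to shorten the GOP / key-frame interval at the source.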


Have a great day,
Artur

-----------------------------------------------------------------------------------------------------------------------
Note: If this post answers your question, please click the Correct Answer button. Thank you!
-----------------------------------------------------------------------------------------------------------------------
