Gstreamer h264 decoding latency on iMX6

Solved

joelcolledge
Contributor I

Hi all,

 

I am using an i.MX6 Quad to decode and display a live H.264-encoded video feed with GStreamer, using the following pipeline:

appsrc block=true is-live=true !
  h264parse !
  queue max-size-time=0 max-size-buffers=0 !
  vpudec low-latency=true frame-plus=1 framedrop=false !
  mfw_v4lsink device=/dev/video17 sync=false async=false


However, I have found that there is a 12-frame delay between pushing a buffer into the appsrc and the corresponding frame being displayed.

 

This may be connected to the following message that mfw_v4lsink is printing:

>>V4L_SINK: Actually buffer status:

        hardware buffer : 12

        software buffer : 0

 

Does anyone have any suggestions about how to reduce this delay?

 

I have attached a test application which demonstrates the problem.
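
For readers without the attachment, the push path in such a test application usually looks something like the sketch below (GStreamer 0.10 API, matching the mfw_* elements in this thread). This is not the attached gsttest.c; push_frame() and its arguments are placeholders for however the application reads encoded frames.

#include <string.h>
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

/* Minimal sketch of pushing one encoded H.264 frame into the appsrc
 * (GStreamer 0.10 API).  With block=true, gst_app_src_push_buffer()
 * blocks once appsrc's internal queue is full, so the decode/display
 * delay shows up as the number of frames pushed before the first one
 * appears on screen. */
static GstFlowReturn
push_frame (GstAppSrc *appsrc, const guint8 *data, gsize size,
            GstClockTime timestamp)
{
  GstBuffer *buffer = gst_buffer_new_and_alloc (size);

  memcpy (GST_BUFFER_DATA (buffer), data, size);
  GST_BUFFER_TIMESTAMP (buffer) = timestamp;

  /* gst_app_src_push_buffer() takes ownership of the buffer. */
  return gst_app_src_push_buffer (appsrc, buffer);
}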

 

Thanks,

 

Joel

Original Attachment has been moved to: gsttest.c.zip

11 Replies
holgerweber
Contributor IV

I am also trying to reduce the latency of the video playback pipeline, see: Re: RTSP gstreamer and vpudec

I currently have no ltib environment installed, and I only want to find out whether we can meet our latency requirements.

Can you send me the generated .so file (built from vpudec.c)?

Tarek
Senior Contributor I

Hi Joel,

Have you tried to use the appsrc timestamp instead of the system clock?

g_object_set(appsrc,"do-timestamp",TRUE, NULL);

and comment out this line:

//GST_BUFFER_TIMESTAMP(buffer) = timestamp;
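
In context, the suggestion amounts to something like the following sketch (GStreamer 0.10 API; the appsrc, data and size names are placeholders for whatever the test application uses, and in a real application the g_object_set() call belongs in the one-time pipeline setup):

#include <string.h>
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

static GstFlowReturn
push_frame_do_timestamp (GstElement *appsrc, const guint8 *data, gsize size)
{
  GstBuffer *buffer;

  /* Let appsrc stamp each buffer with the running time at the moment
   * it is pushed (set once at setup time in a real application). */
  g_object_set (appsrc, "do-timestamp", TRUE, NULL);

  buffer = gst_buffer_new_and_alloc (size);
  memcpy (GST_BUFFER_DATA (buffer), data, size);
  /* GST_BUFFER_TIMESTAMP (buffer) = timestamp;   -- intentionally omitted */

  return gst_app_src_push_buffer (GST_APP_SRC (appsrc), buffer);
}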

joelcolledge
Contributor I

I have been able to reduce this to a 4-frame delay by disabling frame reordering. This is done by changing vpudec.c:1270 to:

vpudec->context.openparam.nReorderEnable = 0

This works because my stream contains no B-frames.  It results in a significant performance improvement.
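
For context, the same setting exposed through the Freescale VPU wrapper API looks roughly like the sketch below. Only the nReorderEnable field is taken from this thread; the surrounding type and function names follow vpu_wrapper.h as I understand it, so treat them as assumptions rather than a verified excerpt of vpudec.c.

#include <string.h>
#include <vpu_wrapper.h>

/* Sketch: open the VPU H.264 decoder with frame reordering disabled.
 * This is only safe when the stream contains no B-frames, i.e. decode
 * order already equals display order, so the VPU never needs to hold
 * frames back.  mem_info is assumed to have been prepared beforehand. */
static VpuDecRetCode
open_decoder_without_reorder (VpuDecHandle *handle, VpuMemInfo *mem_info)
{
  VpuDecOpenParam open_param;

  memset (&open_param, 0, sizeof (open_param));
  open_param.CodecFormat    = VPU_V_AVC;  /* H.264 */
  open_param.nReorderEnable = 0;          /* the change made in vpudec.c */

  return VPU_DecOpen (handle, &open_param, mem_info);
}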

I would still be interested to hear from anyone who has ideas about how to reduce this further.

Solution
ChucoChe
NXP Employee

Just for reference, I'm adding here that setting vpudec->context.openparam.nReorderEnable = 0 reduces the delay to 2 frames when there are no B-frames and no reordering is required.

brettkuehner
Contributor II

You mention making changes to vpudec.c to reduce latency. I have built the system using ltib, and can run gstreamer fine, but I don't see vpudec source anywhere, and I can't find it online. Where did you get it from?

ChucoChe
NXP Employee

On ltib you need to run ./ltib -m prep -p gst-fsl-plugins to get the source.

The source would be at <ltib directory>/rpm/BUILD.

To build you need to run ./ltib -m scbuild and then ./ltib -m scdeploy.

LeonardoSandova
Specialist I

Have you tried the same pipeline using gst-launch? Do you see the same delay?

nagendrasarma
Contributor III

Hi Freescale,

I am using the following pipeline to encode and stream H.264 elementary video. On the other end, VLC media player (Windows) is used to decode the incoming video (via an SDP file).

The video plays properly, but the latency is more than 2 seconds. Could you please help me reduce the latency?

FYI:

HW: MCIMX6Q-SDP
SW: Poky (Yocto Project Reference Distro) 1.5.1
Kernel: 3.0.35-4.1.0+yocto+gbdde708

Pipeline used:

gst-launch-0.10 -vv mfw_v4lsrc ! video/x-raw-yuv, framerate=30/1, width=640, height=480 !
  vpuenc codec=6 ! rtph264pay ! udpsink host=<ipaddr> port=xxx sync=false

Regards,

Nagendra

nagendrasarma
Contributor III

Any updates?

holgerweber
Contributor IV

Are you sure that the encoding is the problem? Which buffer size do you use in VLC (advanced settings)? As far as I know, the default buffer size in VLC is 2000 ms.

joelcolledge
Contributor I

I don't believe I can use the same pipeline with gst-launch because of the appsrc; however, replacing it with fdsrc and piping data to the corresponding file descriptor gives exactly the same result. That is, the first frame is only displayed after 12 frames have been passed into the pipeline.

Using playbin2 (with the fd URI protocol) has a similar effect, only with some extra issues that seem to be due to buffering.
