H.264 decoding using ffmpeg + using GPU for display acceleration

rebelalliance
Contributor III

Hello,

I have a situation where I need to decode low-latency H.264 streams (also known as intra-refresh streams).  I have confirmed that the i.MX6 cannot decode such streams in hardware, because the hardware decoder requires a full I frame, which these streams do not contain.  So what I would like to do is use ffmpeg to decode the stream in software and then push the decoded output to the GPU for display.

Has anyone done this or something similar?  What would be a recommended architecture for such a pipeline?  I have tried MPlayer + SDL + DirectFB, but the video is still choppy.  I am open to using gstreamer if at all possible (perhaps with the ffmpeg plugin?).  I am just trying to brainstorm the best solution for this.

Any input appreciated.

PS: I am using the SabreSD development board with L3.0.35_4.0.0_130424_source.tar.gz to build my rootfs, bootloader, and kernel.

1 Solution
rebelalliance
Contributor III

I got intra-refresh streams to decode correctly on my i.MX6 using the attached patch.

It also appears that aiurdemux cannot demux my streams (neither the normal nor the low-latency one), so I used mpegtsdemux from the gst-plugins-bad collection instead, and it works fine.
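
For reference, the kind of pipeline this enables is roughly the following (a sketch only: mfw_v4lsink is one possible sink from the Freescale gst plugins, and your sink and caps may differ):

gst-launch rtspsrc location=rtsp://<ip> ! rtpmp2tdepay ! mpegtsdemux ! vpudec ! mfw_v4lsink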

Problem solved.

26 Replies
rebelalliance
Contributor III

To take the aiurdemux (Freescale-supplied) demuxer out of the equation, I used mpegtsdemux, which is an open-source GStreamer plugin.  So I launched a pipeline to demux the low-latency TS stream and save it to a file:

gst-launch rtspsrc location=rtsp://<ip> ! rtpmp2tdepay ! mpegtsdemux ! 'video/x-h264' ! filesink location=/path/to/myfile.ts

Then, I passed "myfile.ts" from the above step to mxc_vpu_test.out like so:

./mxc_vpu_test.out -D "-i /root/myfile.ts -f 2 -w 1920 -h 1080 -s 1 -t 1"

I did the above two steps for both low-latency and regular streams, and in both cases ./mxc_vpu_test.out successfully played the generated TS file.  So the H.264 hardware can obviously play these files, yet when one uses the "vpudec" plugin from GStreamer, they don't play!  The demuxing is also fine; otherwise we would get junk files.  So it seems to me that FSL has deliberately designed the vpudec plugin not to play these files, even though the hardware is capable of it.  Thus, I am confused by Jack's comments about low latency being unsupported.
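
For comparison, the vpudec-based pipeline that fails on the same file is roughly along these lines (the sink element here is just an example, not necessarily the one I used):

gst-launch filesrc location=/root/myfile.ts ! mpegtsdemux ! vpudec ! mfw_v4lsink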

I am attaching the two generated TS files used in my experiment.

Please advise.

karina_valencia
NXP Apps Support

ChucoChe, I answered you by email.

ChucoChe
NXP Employee

If you mean using video as a texture, there is an example in the GPU SDK under Demos/GLES2.0/simple_gpu_player.

You can download the GPU SDK from Freescale's webpage.

Michel

LeonardoSandova
Specialist I

The GPU is just for graphics processing, so what you mean is the IPU, right?  Have you executed a pipeline using ffmpeg elements?  Is the performance good?

Leo

rebelalliance
Contributor III

Sorry I was away for a bit.

So I am trying two approaches: mplayer (which does decode intra-refresh videos) and gstreamer with the ffmpeg plugins (I am not sure whether these handle intra-refresh videos, but I just want to get them working first and may be able to tweak ffmpeg to decode these videos later).

For mplayer:

I did try mplayer (latest), but the video was very choppy.  I don't know whether it is the H.264 decoding that is taking the time or the colorspace conversion that is slowing things down.  So I recompiled mplayer with directfb (the old version included in LTIB), but this sometimes hangs the system outright, or else fails without hanging the system and without showing any video, choppy or otherwise.  I will post the details of these directfb errors later.
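
For reference, the kind of invocation I have been trying is roughly the following (the output driver and options are guesses at what might reduce the choppiness, not a verified command):

mplayer -vo directfb -lavdopts threads=4 -framedrop /path/to/stream.ts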

For gstreamer ffmpeg plugin:

I just enabled the FFMPEG plugins for gstreamer in LTIB.  The ffmpeg plugins did not run because a couple of APIs they rely on have been removed, so gst-inspect came up with unresolved-symbol errors on these plugins.  I will probably update the ffmpeg plugins to the latest version eventually, but for now I decided to just fix the minor issues in the source code.  So I now have the ffmpeg plugins, and I will try to build a pipeline to decode H.264 transport stream files in software.

If anyone knows the pipeline for ffmpeg using gst-launch, please advise.
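
Something along these lines is what I plan to try first, but I have not verified it (ffdec_h264 is the H.264 decoder element from the gst-ffmpeg plugin set, and the sink here is just an example):

gst-launch filesrc location=/path/to/myfile.ts ! mpegtsdemux ! ffdec_h264 ! ffmpegcolorspace ! mfw_v4lsink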

I will update with any issues I encounter.

LeonardoSandova
Specialist I