VPU: timestamps, and audio/video sync
11-03-2011
01:39 PM
1,999 Views
PhilEndecott
Contributor II
Does anyone here know anything about the VPU?

I have basic video playback working, and now I need to get audio-video sync correct. This is from an MPEG transport stream (DVB), and I believe that I need to use the Presentation Time Stamp (PTS) fields from the two PES streams for synchronisation.

It is simple enough to feed the two streams into the two decoders in synchrony. Unfortunately this is insufficient, as the latency from writing to the VPU input buffer to getting a frame out is variable. According to the MPEG spec, the correct way to do this is to feed the data to both audio and video decoders as it is received, and to fetch video frames and audio data out of the decoders in synchrony. The difficulty is that I don't know the PTS timestamps for the frames as they come out. Ideally, I think I'd like the VPU to report the corresponding PTS value in the DecOutputInfo struct; however, this does not seem to be present.

Has anyone else ever looked at this? (Are any of Freescale's VPU engineers reading this?)

Thanks, Phil.
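For reference, the PTS values mentioned above live in the optional PES header fields of each elementary stream. A minimal sketch of extracting the 33-bit PTS, assuming `pes` already points at the 0x00 0x00 0x01 start-code prefix of a video or audio PES packet (this is not part of any VPU API, just plain MPEG-2 PES parsing):

```c
#include <stdint.h>

/* Extract the 33-bit PTS from a PES packet header.
 * Returns -1 if the packet carries no PTS. */
static int64_t pes_get_pts(const uint8_t *pes)
{
    /* pes[0..2] = start code, pes[3] = stream_id, pes[4..5] = packet length,
     * pes[6..7] = flag bytes, pes[8] = PES_header_data_length. */
    uint8_t pts_dts_flags = (pes[7] >> 6) & 0x3;   /* '10' = PTS only, '11' = PTS+DTS */
    if (pts_dts_flags != 0x2 && pts_dts_flags != 0x3)
        return -1;

    const uint8_t *p = pes + 9;                    /* PTS is the first optional field */
    return ((int64_t)((p[0] >> 1) & 0x07) << 30) | /* PTS[32..30] */
           ((int64_t)  p[1]               << 22) | /* PTS[29..22] */
           ((int64_t)((p[2] >> 1) & 0x7F) << 15) | /* PTS[21..15] */
           ((int64_t)  p[3]               <<  7) | /* PTS[14..7]  */
           ((int64_t)((p[4] >> 1) & 0x7F));        /* PTS[6..0]   */
}
```

The 33-bit value is in units of a 90 kHz clock, so dividing by 90 gives milliseconds.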
3 Replies
11-04-2011
07:41 AM
1,475 Views
KanHU
Contributor III
It's under libs/me/ of the gst-fsl-plugins package. It's a really simple algorithm.
I haven't looked at the ffmpeg code, but I think there should be a similar timestamp algorithm in there too.
11-04-2011
07:17 AM
1,475 Views
PhilEndecott
Contributor II
Thanks for the reply. No, I'm not using a "multimedia framework". Can you point me to the source for this gstreamer algorithm? What you describe sounds like a bit of a hack.
11-03-2011
06:38 PM
1,477 Views
KanHU
Contributor III
Are you using a multimedia framework? For example, Freescale provides gstreamer plugins which can handle AV sync.
Currently the VPU gstreamer plugins provided by Freescale handle timestamps with a simple algorithm. The precondition is that a PTS is received on a per-frame basis: the plugin keeps a pool holding the timestamps it has received, and takes a timestamp from the pool whenever a decompressed frame is output. There is also some logic to reorder or correct the timestamps, since some of them are illegal values.
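A rough sketch of that timestamp-pool idea in C (this is not the actual gst-fsl-plugins code, just an illustration of the approach described above): push the PTS of every compressed frame fed to the decoder, and pop the smallest pending PTS when a decoded frame comes out. Taking the minimum rather than strict FIFO order compensates for B-frame reordering, since frames are displayed in PTS order.

```c
#include <stdint.h>

#define PTS_POOL_SIZE 32
#define PTS_INVALID   UINT64_MAX

typedef struct {
    uint64_t slots[PTS_POOL_SIZE];   /* pending timestamps, PTS_INVALID = empty slot */
} pts_pool;

static void pts_pool_init(pts_pool *pool)
{
    for (int i = 0; i < PTS_POOL_SIZE; i++)
        pool->slots[i] = PTS_INVALID;
}

/* Call when a compressed frame with timestamp `pts` is written to the decoder. */
static void pts_pool_push(pts_pool *pool, uint64_t pts)
{
    for (int i = 0; i < PTS_POOL_SIZE; i++) {
        if (pool->slots[i] == PTS_INVALID) {
            pool->slots[i] = pts;
            return;
        }
    }
    /* Pool full: a real implementation would flush or grow here. */
}

/* Call when the decoder delivers a frame; returns the smallest pending PTS,
 * or PTS_INVALID if the pool is empty (e.g. the frame carried no PTS). */
static uint64_t pts_pool_pop(pts_pool *pool)
{
    int best = -1;
    for (int i = 0; i < PTS_POOL_SIZE; i++) {
        if (pool->slots[i] != PTS_INVALID &&
            (best < 0 || pool->slots[i] < pool->slots[best]))
            best = i;
    }
    if (best < 0)
        return PTS_INVALID;
    uint64_t pts = pool->slots[best];
    pool->slots[best] = PTS_INVALID;
    return pts;
}
```

When `pts_pool_pop` returns PTS_INVALID (a frame arrived without a usable timestamp), the missing value can be interpolated from the previous output PTS plus the nominal frame duration, which is presumably what the "correct illegal values" step in the gstreamer plugin amounts to.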