Given two possible ways to launch a video:
gst-launch filesrc location=sintel-1024-surround.mp4 typefind=true ! aiurdemux ! vpudec frame-plus=1 ! mfw_ipucsc ! mfw_v4lsink
gst-launch filesrc location=sintel-1024-surround.mp4 typefind=true ! aiurdemux ! vpudec frame-plus=1 ! mfw_v4lsink
Both of these pipelines work. However, the first one, which contains mfw_ipucsc, should be using hardware (the IPU) to convert the YUV output of vpudec to the RGB that is written into the framebuffer /dev/fb0.
The second form is identical to the pipeline that is created automatically when using playbin2.
This raises the question: how does mfw_v4lsink handle colour space conversion when it is fed YUV on its input? In software?
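One way to narrow this down is to look at the caps mfw_v4lsink advertises. A quick check (assuming the Freescale plugins are installed and gst-inspect is available; the sed range is just a convenience to print the pad-template section):

```shell
# Print the pad templates of mfw_v4lsink; if raw YUV formats are listed
# on the sink pad, the element accepts YUV directly and any conversion
# to RGB must happen inside the element itself.
gst-inspect mfw_v4lsink | sed -n '/Pad Templates/,/^$/p'
```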
This question is triggered by the following observation:
We have a Qt5 application that uses WebKit to open, for example, a YouTube page and then play a movie.
With GST_DEBUG=2 we see a lot of frame buffers being dropped, the movie plays badly, and the i.MX6 runs hot until the "cooling" device reduces the CPU frequency.
WebKit uses playbin2 to launch the video.
So my guess is that the YUV output of vpudec is converted in software before being merged into the framebuffer output, and that this conversion takes a lot of CPU effort.
If the above holds, why does mfw_v4lsink not automatically use mfw_ipucsc when it is fed YUV on an i.MX6?
As an experiment I removed the YUV formats from the mfw_v4lsink caps. playbin2 then generates an invalid-pipeline error: it is unable to automatically insert mfw_ipucsc, which provides the YUV-to-RGB conversion.
The gst-launch pipeline with mfw_ipucsc still functions correctly with this modification.
Is there another way to force playbin2 to use mfw_ipucsc?
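One avenue worth trying: playbin2 exposes a "video-sink" property that overrides its automatic sink selection. A sketch, untested on i.MX6 (the file path is an example, and whether gst-launch on this platform accepts a quoted bin description for an element-valued property needs verifying):

```shell
# Force playbin2 to use a specific sink instead of auto-plugging one.
gst-launch playbin2 uri=file:///path/to/sintel-1024-surround.mp4 \
    video-sink=mfw_v4lsink

# If the parser accepts a bin description here, the IPU converter can be
# inserted in front of the sink; otherwise the equivalent has to be built
# in application code as a GstBin with a ghost pad and set via
# g_object_set(playbin, "video-sink", bin, NULL).
gst-launch playbin2 uri=file:///path/to/sintel-1024-surround.mp4 \
    video-sink="mfw_ipucsc ! mfw_v4lsink"
```

For WebKit this only helps if the application can reach the playbin2 instance WebKit creates, so the GstBin/ghost-pad approach inside the application is the more likely route.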