Multiple-Overlay (or Multi-Overlay) means several video playbacks on a single screen. If multiple screens are needed instead, see the dual-display case in GStreamer i.MX6 Multi-Display.
$ export VSALPHA=1
$ SAMPLE1=sample1.avi; SAMPLE2=sample2.avi; SAMPLE3=sample3.avi; SAMPLE4=sample4.avi;
$ WIDTH=320; HEIGHT=240; SEP=20
$ gst-launch playbin2 uri=file://`pwd`/$SAMPLE1 video-sink="mfw_isink axis-top=0 axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT" \
playbin2 uri=file://`pwd`/$SAMPLE2 video-sink="mfw_isink axis-top=0 axis-left=`expr $WIDTH + $SEP` disp-width=$WIDTH disp-height=$HEIGHT" \
playbin2 uri=file://`pwd`/$SAMPLE3 video-sink="mfw_isink axis-top=`expr $HEIGHT + $SEP` axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT" \
playbin2 uri=file://`pwd`/$SAMPLE4 video-sink="mfw_isink axis-top=`expr $HEIGHT + $SEP` axis-left=`expr $WIDTH + $SEP` disp-width=$WIDTH disp-height=$HEIGHT"
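The axis-top/axis-left values above follow a simple grid formula; a small helper (hypothetical, not part of the BSP) makes it easy to extend the layout to more tiles:

```shell
# Hypothetical helper: print "axis-top axis-left" for tile (ROW, COL)
# in a grid of WIDTH x HEIGHT tiles separated by SEP pixels, matching
# the four pipelines above.
WIDTH=320; HEIGHT=240; SEP=20

tile_pos() {
    row=$1; col=$2
    top=`expr $row \* \( $HEIGHT + $SEP \)`
    left=`expr $col \* \( $WIDTH + $SEP \)`
    echo "$top $left"
}

tile_pos 1 1    # bottom-right tile of the 2x2 layout above: prints "260 340"
```

For example, `tile_pos 0 1` prints "0 340", the same position used for $SAMPLE2 above.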
$ gst-launch playbin2 uri=file://`pwd`/$SAMPLE1 video-sink="mfw_isink axis-top=0 axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT rotation=0" \
playbin2 uri=file://`pwd`/$SAMPLE2 video-sink="mfw_isink axis-top=`expr $HEIGHT + $SEP` axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT rotation=3"
Thanks for these examples, I found them very useful!
But I have a question: is the video quality of mfw_isink not as good as with mfw_v4lsink? It seems like there are fewer colors or something, and the quality is not acceptable. What is the basic difference between these two? (I am a newbie in this area.)
I get better quality with mfw_v4lsink but haven't succeeded in playing multiple videos at the same time with it. I am trying to make a PIP view where a small video sits in the corner while the other is full screen. Is it possible to do that with mfw_v4lsink, or somehow get mfw_isink to produce better quality?
I am using a Sabre Lite development board with the L3.0.35_1.1.0_oneiric Ubuntu image on it, playing to HDMI out at 1920x1080 resolution.
I would be very grateful for advice on this.
Responding to your questions:
1. Is video quality of mfw_isink not as good as with mfw_v4lsink?
Quality should be the same in both elements.
2. What is basic difference of these two?
Both are similar elements in the sense that both are video sinks; however, the isink element uses the IPU library, which allows multi-overlay, meaning that several playbacks can run on the same device. This is not possible with the mfw_v4lsink element. Check the properties of each element using 'gst-inspect'. Also, check the i.MX Linux Multimedia Framework User's Guide for more info.
3. Is it possible to make that with mfw_v4lsink or somehow get mfw_isink produce better quality?
It is not possible; v4lsink does not allow multi-overlay. v4lsink can be used when multiple videos are played, each on a different display (this is commonly called multi-display, where the most common case is dual-display).
Can you try these two pipelines?
$ gst-launch filesrc location=sample.mp4 typefind=true ! aiurdemux ! queue max-size-time=0 ! vpudec ! mfw_v4lsink
$ gst-launch filesrc location=sample.mp4 typefind=true ! aiurdemux ! queue max-size-time=0 ! vpudec ! mfw_isink
Is the quality the same? Can you run the same pipeline BUT with a lower resolution media file? Let me know your results.
I'd like to know how I can take advantage of the i.MX6Q HW engine to play multiple videos across 2 displays.
1. What is the maximum number of 720p or 480p h.264 streams that i.MX6 can decode in HW?
2. Can more than four videos be played on i.MX6Q via mfw_isink at the same time on one display, or is 4 the limit?
3. Can I play 4 720p streams on display1 via mfw_isink and another 1080p on another display2?
4. What happens when I need to change/remove one of the videos, will other live streams be affected?
5. What do you expect the CPU utilization to be while those videos are playing, assuming no network streaming overhead?
Thanks for the answers. I have now tried the pipelines you suggested with 1920x880, 720x432 and 608x256 videos, and in all cases the quality with mfw_isink is not as good as with mfw_v4lsink. It is as if isink has fewer colors or v4lsink does some smoothing: especially where there is a gradient-like color slide, with isink I can see clear steps where the color changes, but with v4lsink it is smoother. Any more suggestions?
And what is the relation between screen depth given in bootargs and /usr/share/vssconfig?
I have in bootargs "video=mxcfb0:dev=hdmi,1920x1080@60,if=RGB24,bpp=32" to make it truecolor, but "format = RGBP" in vssconfig. I have tried using RGB3 or RGB4 in vssconfig, but that did not work (it caused strange colors or flickering).
Exactly, same here.
Since we are focusing on dual display with the i.MX6, this is very critical to us.
You can find another issue, with captured images, in my last post.
A quick response would be very much appreciated.
Please see the attached file (3 is the answer for 720p@30fps videos)
Up to 4 videos.
3. Can I play 4 720p streams on display1 via mfw_isink and another 1080p on another display2?
Let me investigate on this particular one.
Could you please explain what you mean by 'change/remove'?
I have never run 'top' while doing a playback, but I assume most of the work is done on the VPU. However, for a single playback using playbin2, lots of intermediate elements (between the source and the demuxer, between the demuxer and the decoder, between the decoder and the sink) are created, so even if the data processing is done on the VPU, buffer passing between elements happens on the user-space side, so the CPU is definitely doing some work. I will post some numbers in case I find them internally.
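For reference, overall CPU utilization during a playback can be derived from two samples of the 'cpu' line in /proc/stat taken a second apart; a minimal sketch (my own, not from the BSP, and only using the first four fields for simplicity):

```shell
# Sketch: busy % between two "cpu user nice system idle ..." samples.
cpu_busy() {
    s1=$1; s2=$2
    # Total = user + nice + system + idle; idle is the 5th field.
    set -- $s1; t1=$(($2 + $3 + $4 + $5)); i1=$5
    set -- $s2; t2=$(($2 + $3 + $4 + $5)); i2=$5
    # busy % = 100 * (delta_total - delta_idle) / delta_total
    echo $(( 100 * ((t2 - t1) - (i2 - i1)) / (t2 - t1) ))
}

# Example with made-up numbers (fields: user nice system idle):
cpu_busy "cpu 100 0 50 850" "cpu 140 0 70 890"   # prints 60
```

On the board you would sample the real counters, e.g. `A=$(head -1 /proc/stat); sleep 1; B=$(head -1 /proc/stat); cpu_busy "$A" "$B"`, while the pipeline runs in the background.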
Heikki: the quality issue (mfw_isink versus mfw_v4lsink) is currently being investigated. I will update this thread ASAP.
Only mfw_isink has multi-video support on multi-display devices. mfw_isink is based on the IPU device IOCTL.
Do you have more detailed steps to reproduce the issues? What test clips are used?
heikkiti, can you please give more details of the setup you have when you run the video sink pipelines? clips? etc.
xiaoli.zhang, could you answer the questions posted by Heikki in this comment?
Setup is Sabre Lite board with L3.0.35_1.1.0_oneiric ubuntu image. I can see problem with any video but e.g. 2D High Definition Trailers (HD) - Demo World / Taste Of Kitchen (e.g. tomatoes look much better with v4l_sink). You can play with pipelines you suggested. In bootargs I have video=mxcfb0:dev=hdmi,1920x1080@60,if=RGB24,bpp=32 and have HDMI monitor connected.
Thanks Heikki. Xiaoli is investigating this issue.
Did you get to the bottom of this? What I've found is that it's not a sink element issue but an mfw_ipucsc issue. If the source and sink both support the same capabilities, so the IPU doesn't need to do anything, the colour shows up fine. However, if the IPU needs to convert YUV to RGB, for example, it looks like it has the "effect" of down-sampling an image from 16-bit colour to 8-bit.
xiaoli.zhang, could you confirm what Steve is mentioning? EricNelson, have you seen this behaviour on Sabre Lite (mfw_v4lsink frames looking better than mfw_isink frames)?
The only things I've noticed are the obvious artifacts of alpha-blending, but it's easy enough to test.
I would like to test 16 QVGA videos.
How to change the maximum from 4 to 16? Thanks
Hi, please try what suggested here: https://community.freescale.com/message/313599#313599
I checked with internal team. It seems we didn't observe the issue. Can you please clarify what YUV format conversion has the problem? Or please re-share your test command line? Thanks.
Thanks for the reply, but I can't access it even though I am already logged in.
Attached please find the problem I have.
In file mfw_gst_vss_common.c:
vd->vsmax = 4
Change this to however many videos you want to play.
In vss/mfw_gst_vss_common.h:
#define VS_MAX 8
You need to change this too if you want more than 8.
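If you prefer scripting the change before rebuilding the plugin, the two edits are simple sed substitutions. The sketch below applies them to inline sample text so the effect is visible; on a real source tree you would run sed with -i against mfw_gst_vss_common.c and vss/mfw_gst_vss_common.h (paths as in the post above, 16 chosen as an example limit):

```shell
# Sketch: the two substitutions from the post, shown on inline samples.
echo 'vd->vsmax = 4;'   | sed 's/vsmax = 4/vsmax = 16/'    # -> vd->vsmax = 16;
echo '#define VS_MAX 8' | sed 's/VS_MAX 8/VS_MAX 16/'      # -> #define VS_MAX 16
```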
It works with six D1 (720x480) streams decoding and displaying simultaneously and smoothly.
Theoretically this VPU should very easily decode 16 QVGA streams; however, I hit a problem while playing only 9 video streaming sources.
Attached please find my command line and the error "There may be a timestamping problem, or this computer is too slow…".
Actually, I also get this problem very often while playing a single 1080p video source file.
Thank you for your contribution. I'm working with GStreamer on fsl-image-gui using v4lsink. However, when I try to use mfw_isink, GStreamer tells me: WARNING: erroneous pipeline: no element "mfw_isink".
So if I inspect (gst-inspect | grep "sink"), you'll see this element is not installed:
alsa: alsasink: Audio sink (ALSA)
soup: souphttpclientsink: HTTP client sink
xvimagesink: xvimagesink: Video sink
playback: playsink: Player Sink
gio: giosink: GIO sink
gio: giostreamsink: GIO stream sink
autodetect: autovideosink: Auto video sink
autodetect: autoaudiosink: Auto audio sink
v4lsink.imx: mfw_v4lsink: v4l2 video sink
coreelements: fakesink: Fake Sink
coreelements: fdsink: Filedescriptor Sink
coreelements: filesink: File Sink
tcp: tcpclientsink: TCP client sink
tcp: tcpserversink: TCP server sink
tcp: multifdsink: Multi filedescriptor sink
fbdevsink: fbdevsink: fbdev video sink
How can I add it ?
Are you using latest LTIB or Yocto Framework?
I'm using Yocto Framework.
Can you send this issue to the meta-freescale distribution list? There is one issue but it assumes you already have isink :smileysad:
Thanks for the tutorial, it has been very helpful. But the title of this page is "Multi-Overlay". Is it possible to actually lay one video on top of the other and still see both playing?
I think the last one launched overlays the one in the back, but alpha blending also comes into play. rogeriopimentel, any idea?
I'm using i.MX6 with Yocto BSP for a multimedia application.
Is there any way to obtain video transition effects (e.g. dissolve) with a VPU-accelerated plugin? I've tested some standard GStreamer plugins (smpte, smptealpha, videomixer), but all of them have big performance issues.
Thanks in advance
I am not aware of any... what is the particular gst element you want to accelerate?
Ideally it would be nice to have an accelerated version of smpte, which is a nice and complete plugin for transition effects between two video sinks. But videomixer could also be enough, using a GstController to dynamically control the alpha coefficient of each sink and obtain a fading transition between two inputs.
I would like to record 2 webcams into the same file.
Today, I can play and record videos using v4l, something like this:
gst-launch filesrc location=MatrixXQ.avi typefind=true ! aiurdemux ! queue max-size-time=0 ! vpudec ! mfw_v4lsink
gst-launch v4l2src device=/dev/video0 ! video/x-raw-yuv,width=640,height=480 ! avimux ! filesink location=test0.avi
gst-launch v4l2src device=/dev/video0 ! video/x-raw-yuv,width=320,height=240,framerate=10/1 ! mfw_v4lsink
I can also play 2 videos at the same time on different parts of the screen.
gst-launch playbin2 uri=file:///home/root/video/MatrixXQ.avi video-sink="mfw_isink axis-top=0 axis-left=0 disp-width=320 disp-height=240" playbin2 uri=file:///home/root/video/hd_other_samsung_led_motion.mp4 video-sink="mfw_isink axis-top=0 axis-left=`expr 320 + 20` disp-width=320 disp-height=240" playbin2 uri=file:///home/root/video/swiss.mp4 video-sink="mfw_isink axis-top=`expr 240 + 20` axis-left=0 disp-width=320 disp-height=240"
My point is: is there any way to take the source video from v4l2src device=/dev/video1 and v4l2src device=/dev/video0, combine them into the same stream, and record it to a FILE?
Thanks for all help.
I have the same question as munoz0raul. Could you give some help, LeonardoSandovalGonzalez?
Still waiting for an example.
Anyone know if it's possible to do it?
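For the two-webcam question above, a possible starting point (an untested sketch using GStreamer 0.10 element names: videobox with a negative left border shifts the second camera to the right before videomixer composites the two, and avimux writes a single file) would be:

```shell
# Untested sketch: composite /dev/video0 and /dev/video1 side by side
# and mux the result into one AVI. Built as a string here so it is easy
# to inspect; run it on the board where v4l2src, videomixer, videobox
# and avimux are available. Resolutions and file name are examples.
CMD='gst-launch -e videomixer name=mix ! ffmpegcolorspace ! avimux ! filesink location=both.avi \
  v4l2src device=/dev/video0 ! video/x-raw-yuv,width=320,height=240 ! videobox border-alpha=0 left=0 ! mix. \
  v4l2src device=/dev/video1 ! video/x-raw-yuv,width=320,height=240 ! videobox border-alpha=0 left=-320 ! mix.'
echo "$CMD"
```

Whether this performs acceptably on the i.MX6 is another question, since videomixer runs on the CPU (see the transition-effects discussion above).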