<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: [iMX8MPlus] How to mix two video sources into one with Gstreamer ? in i.MX Processors</title>
    <link>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1571617#M198920</link>
    <description>&lt;P&gt;Hi &lt;a href="https://community.nxp.com/t5/user/viewprofilepage/user-id/183729"&gt;@dianapredescu&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;Applying the patches I mentioned in the previous comment, I was able to test the following cases with BSP-5.10.72, and they seemed to work:&lt;/P&gt;&lt;P&gt;1. Video composition of two 1440x1080 streams, displayed on an HDMI display.&lt;/P&gt;&lt;P&gt;a. From video test sources:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;DISPLAY=:0 gst-launch-1.0 imxcompositor_g2d name=comp \
sink_0::xpos=0 sink_0::ypos=0 sink_0::width=1440 sink_0::height=1080 \
sink_1::xpos=0 sink_1::ypos=1080 sink_1::width=1440 sink_1::height=1080 \
! queue ! imxvideoconvert_g2d ! video/x-raw, width=1440, height=2160 ! glimagesink render-rectangle="&amp;lt;0,0,1440,2160&amp;gt;" \
videotestsrc ! imxvideoconvert_g2d ! video/x-raw, width=1440, height=1080 ! queue ! comp.sink_0 \
videotestsrc ! imxvideoconvert_g2d ! video/x-raw, width=1440, height=1080 ! queue ! comp.sink_1&lt;/LI-CODE&gt;&lt;P&gt;b. From camera sensor sources:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;DISPLAY=:0 gst-launch-1.0 imxcompositor_g2d name=comp \
sink_0::xpos=0 sink_0::ypos=0 sink_0::width=1440 sink_0::height=1080 \
sink_1::xpos=0 sink_1::ypos=1080 sink_1::width=1440 sink_1::height=1080 \
! queue ! imxvideoconvert_g2d ! video/x-raw, width=1440, height=2160 ! glimagesink render-rectangle="&amp;lt;0,0,1440,2160&amp;gt;" \
v4l2src device=/dev/video2 ! imxvideoconvert_g2d ! video/x-raw, width=1440, height=1080 ! queue ! comp.sink_0 \
v4l2src device=/dev/video3 ! imxvideoconvert_g2d ! video/x-raw, width=1440, height=1080 ! queue ! comp.sink_1&lt;/LI-CODE&gt;&lt;P&gt;2. Video composition of two 1440x1080 streams, H264 encoding/decoding, and display on an HDMI display.&lt;/P&gt;&lt;P&gt;a. From video test sources:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;DISPLAY=:0 gst-launch-1.0 imxcompositor_g2d name=comp \
sink_0::xpos=0 sink_0::ypos=0 sink_0::width=1440 sink_0::height=1080 \
sink_1::xpos=0 sink_1::ypos=1080 sink_1::width=1440 sink_1::height=1080 \
! queue ! imxvideoconvert_g2d ! video/x-raw, width=1440, height=2160, framerate=5/1 ! v4l2h264enc ! queue ! v4l2h264dec ! glimagesink render-rectangle="&amp;lt;0,0,1440,2160&amp;gt;" \
videotestsrc ! imxvideoconvert_g2d ! video/x-raw, width=1440, height=1080, framerate=5/1 ! queue ! comp.sink_0 \
videotestsrc ! imxvideoconvert_g2d ! video/x-raw, width=1440, height=1080, framerate=5/1 ! queue ! comp.sink_1&lt;/LI-CODE&gt;&lt;P&gt;b. From camera sensor sources:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;DISPLAY=:0 gst-launch-1.0 imxcompositor_g2d name=comp \
sink_0::xpos=0 sink_0::ypos=0 sink_0::width=1440 sink_0::height=1080 \
sink_1::xpos=0 sink_1::ypos=1080 sink_1::width=1440 sink_1::height=1080 \
! queue ! imxvideoconvert_g2d ! video/x-raw, width=1440, height=2160, framerate=5/1 ! v4l2h264enc ! queue ! v4l2h264dec ! glimagesink render-rectangle="&amp;lt;0,0,1440,2160&amp;gt;" \
v4l2src device=/dev/video2  ! imxvideoconvert_g2d ! video/x-raw, width=1440, height=1080, framerate=15/1 ! queue ! comp.sink_0 \
v4l2src device=/dev/video3 ! imxvideoconvert_g2d ! video/x-raw, width=1440, height=1080, framerate=15/1 ! queue ! comp.sink_1&lt;/LI-CODE&gt;&lt;P&gt;3. Video composition of two 1440x1080 streams, H264 encoding, and streaming over the network.&lt;/P&gt;&lt;P&gt;a. From video test sources:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;gst-launch-1.0 imxcompositor_g2d name=comp \
sink_0::xpos=0 sink_0::ypos=0 sink_0::width=1440 sink_0::height=1080 \
sink_1::xpos=0 sink_1::ypos=1080 sink_1::width=1440 sink_1::height=1080 \
! queue ! videoconvert ! v4l2h264enc ! rtph264pay config-interval=1 pt=96 ! udpsink host=192.168.1.8 port=5000 \
videotestsrc ! imxvideoconvert_g2d ! video/x-raw, width=1440, height=1080 ! queue ! comp.sink_0 \
videotestsrc ! imxvideoconvert_g2d ! video/x-raw, width=1440, height=1080 ! queue ! comp.sink_1&lt;/LI-CODE&gt;&lt;P&gt;b. From camera sensor sources:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;gst-launch-1.0 -vvv imxcompositor_g2d name=comp \
sink_0::xpos=0 sink_0::ypos=0 sink_0::width=1440 sink_0::height=1080 sink_1::xpos=0 \
sink_1::ypos=1080 sink_1::width=1440 sink_1::height=1080 ! queue ! videoconvert ! v4l2h264enc output-io-mode=dmabuf-import ! rtph264pay config-interval=1 pt=96 ! udpsink host=192.168.1.8 port=5000 \
v4l2src device=/dev/video2 io-mode=dmabuf ! imxvideoconvert_g2d ! "video/x-raw, width=1440, height=1080, framerate=15/1" ! queue ! comp.sink_0 \
v4l2src device=/dev/video3 io-mode=dmabuf ! imxvideoconvert_g2d ! "video/x-raw, width=1440, height=1080, framerate=15/1" ! queue ! comp.sink_1&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;However, in case 1.b), there seems to be the following performance issue:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;root@imx8mpevk:~# DISPLAY=:0 gst-launch-1.0 imxcompositor_g2d name=comp \
&amp;gt; sink_0::xpos=0 sink_0::ypos=0 sink_0::width=1440 sink_0::height=1080 \
&amp;gt; sink_1::xpos=0 sink_1::ypos=1080 sink_1::width=1440 sink_1::height=1080 \
&amp;gt; ! queue ! imxvideoconvert_g2d ! video/x-raw, width=1440, height=2160 ! glimagesink render-rectangle="&amp;lt;0,0,1440,2160&amp;gt;" \
&amp;gt; v4l2src device=/dev/video2 ! imxvideoconvert_g2d ! video/x-raw, width=1440, height=1080 ! queue ! comp.sink_0 \
&amp;gt; v4l2src device=/dev/video3 ! imxvideoconvert_g2d ! video/x-raw, width=1440, height=1080 ! queue ! comp.sink_1
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Got context from element 'sink': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayWayland\)\ gldisplaywayland0";
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
WARNING: from element /GstPipeline:pipeline0/GstGLImageSinkBin:glimagesinkbin0/GstGLImageSink:sink: A lot of buffers are being dropped.
Additional debug info:
../git/libs/gst/base/gstbasesink.c(3136): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstGLImageSinkBin:glimagesinkbin0/GstGLImageSink:sink:
There may be a timestamping problem, or this computer is too slow.
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:19.281492055
Setting pipeline to NULL ...
Total showed frames (22), playing for (0:00:19.281454548), fps (1.141).
Freeing pipeline ...&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;And it seems that only the stream of the 1st camera is shown on the display in cases 1) &amp;amp; 2).&lt;/P&gt;&lt;P&gt;Also, could you please share the GStreamer or VLC settings to visualize the stream on the PC side over the network?&lt;/P&gt;&lt;P&gt;Best Regards,&lt;BR /&gt;Khang&lt;/P&gt;</description>
    <pubDate>Mon, 19 Dec 2022 03:24:05 GMT</pubDate>
    <dc:creator>khang_letruong</dc:creator>
    <dc:date>2022-12-19T03:24:05Z</dc:date>
    <item>
      <title>[iMX8MPlus] How to mix two video sources into one with Gstreamer ?</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1531513#M195897</link>
      <description>&lt;P&gt;Dear Community,&lt;/P&gt;&lt;P&gt;GStreamer has the notion of composition for outputting multiple video displays, but I need to go the opposite way: I would like to mix/combine two H264-encoded videos (from /dev/video0 and /dev/video1, for example) on the i.MX8M Plus. Is this possible with GStreamer?&lt;/P&gt;&lt;P&gt;Best regards,&lt;BR /&gt;Khang&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 03 Oct 2022 08:31:01 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1531513#M195897</guid>
      <dc:creator>khang_letruong</dc:creator>
      <dc:date>2022-10-03T08:31:01Z</dc:date>
    </item>
    <item>
      <title>Re: [iMX8MPlus] How to mix two video sources into one with Gstreamer ?</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1531942#M195932</link>
      <description>&lt;P&gt;What do you mean by mixing two videos? Displaying two videos on the same display? An overlay? What is your use case?&lt;/P&gt;</description>
      <pubDate>Tue, 04 Oct 2022 06:14:34 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1531942#M195932</guid>
      <dc:creator>joanxie</dc:creator>
      <dc:date>2022-10-04T06:14:34Z</dc:date>
    </item>
    <item>
      <title>Re: [iMX8MPlus] How to mix two video sources into one with Gstreamer ?</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1531954#M195937</link>
      <description>&lt;P&gt;Dear &lt;a href="https://community.nxp.com/t5/user/viewprofilepage/user-id/39586"&gt;@joanxie&lt;/a&gt; ,&lt;/P&gt;&lt;P&gt;My use case is to read two (hardware-synchronized) camera sensors (/dev/video0, /dev/video1) as close to simultaneously as possible, encode them, transfer them over the network, and display them in the same GStreamer window on the client side. The resolution of each sensor is 1440x1080.&lt;/P&gt;&lt;P&gt;By hardware synchronization, I mean that one camera acts as master and the other acts as slave to guarantee the capture period.&lt;/P&gt;&lt;P&gt;Best Regards,&lt;BR /&gt;Khang&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 04 Oct 2022 07:07:30 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1531954#M195937</guid>
      <dc:creator>khang_letruong</dc:creator>
      <dc:date>2022-10-04T07:07:30Z</dc:date>
    </item>
    <item>
      <title>Re: [iMX8MPlus] How to mix two video sources into one with Gstreamer ?</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1531960#M195939</link>
      <description>&lt;P&gt;Hi Khang,&lt;BR /&gt;To me this sounds more like a synchronization issue on the client playback side.&lt;/P&gt;&lt;P&gt;However, one can embed several H264 elementary streams into a single multiplexing format. If I remember correctly, Matroska (MKV) allows that.&lt;/P&gt;&lt;P&gt;I am not sure all clients can render this the way you want.&lt;/P&gt;&lt;P&gt;IMHO, you will have more success sending the two H264 streams separately and embedding their respective timestamps. The timestamps should share the same relative time base.&lt;/P&gt;&lt;P&gt;A GStreamer client can then play back these two streams in a synchronized manner.&lt;/P&gt;&lt;P&gt;Long story short, I would put the effort on the decoding side and not on the encoder.&lt;/P&gt;</description>
      <pubDate>Tue, 04 Oct 2022 07:20:17 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1531960#M195939</guid>
      <dc:creator>malik_cisse</dc:creator>
      <dc:date>2022-10-04T07:20:17Z</dc:date>
    </item>
    <item>
      <title>Re: [iMX8MPlus] How to mix two video sources into one with Gstreamer ?</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1531968#M195942</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.nxp.com/t5/user/viewprofilepage/user-id/147542"&gt;@khang_letruong&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;
&lt;P&gt;So you want a GStreamer pipeline to compose your two camera streams and stream the composed data over the network, right?&lt;/P&gt;
&lt;BLOCKQUOTE&gt;
&lt;P&gt;gst-launch-1.0 imxcompositor_g2d name=comp \&lt;BR /&gt;sink_0::xpos=0 sink_0::ypos=0 sink_0::width=640 sink_0::height=480 \&lt;BR /&gt;sink_1::xpos=0 sink_1::ypos=480 sink_1::width=640 sink_1::height=480 \&lt;BR /&gt;! queue ! videoconvert ! v4l2h264enc ! rtph264pay config-interval=1 pt=96 ! udpsink host=169.254.235.127 port=5000 \&lt;BR /&gt;v4l2src device=/dev/video2 ! imxvideoconvert_g2d ! video/x-raw, width=640, height=480 ! queue ! comp.sink_0 \&lt;BR /&gt;v4l2src device=/dev/video3 ! imxvideoconvert_g2d ! video/x-raw, width=640, height=480 ! queue ! comp.sink_1&lt;/P&gt;
&lt;/BLOCKQUOTE&gt;
&lt;P&gt;On the client side, the output (with some latency):&lt;/P&gt;
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="dianapredescu_0-1664868715131.png" style="width: 400px;"&gt;&lt;img src="https://community.nxp.com/t5/image/serverpage/image-id/195669iECACCF15C744E240/image-size/medium?v=v2&amp;amp;px=400" role="button" title="dianapredescu_0-1664868715131.png" alt="dianapredescu_0-1664868715131.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;Is this helpful?&lt;/P&gt;
&lt;P&gt;Diana&lt;/P&gt;</description>
      <pubDate>Tue, 04 Oct 2022 07:33:37 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1531968#M195942</guid>
      <dc:creator>dianapredescu</dc:creator>
      <dc:date>2022-10-04T07:33:37Z</dc:date>
    </item>
    <item>
      <title>Re: [iMX8MPlus] How to mix two video sources into one with Gstreamer ?</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1531969#M195943</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.nxp.com/t5/user/viewprofilepage/user-id/109210"&gt;@malik_cisse&lt;/a&gt; ,&lt;/P&gt;&lt;BLOCKQUOTE&gt;&lt;P&gt;IMHO, you will have more success with sending the two h264 streams separately and embed their respective timestamps. Timestamps should have same relative time base&lt;/P&gt;&lt;/BLOCKQUOTE&gt;&lt;P&gt;Do you mean that GStreamer could provide the timestamps?&lt;/P&gt;&lt;P&gt;My observation is that even when I launch two separate gst-launch commands (H264 enabled) with &lt;STRONG&gt;&amp;amp;&lt;/STRONG&gt; on the board as well as on the client machine, there is still an issue with the arriving frames: they are not in sync.&lt;/P&gt;&lt;P&gt;Best Regards,&lt;BR /&gt;Khang&lt;/P&gt;</description>
      <pubDate>Tue, 04 Oct 2022 07:38:08 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1531969#M195943</guid>
      <dc:creator>khang_letruong</dc:creator>
      <dc:date>2022-10-04T07:38:08Z</dc:date>
    </item>
    <item>
      <title>Re: [iMX8MPlus] How to mix two video sources into one with Gstreamer ?</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1531973#M195944</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.nxp.com/t5/user/viewprofilepage/user-id/183729"&gt;@dianapredescu&lt;/a&gt; ,&lt;/P&gt;&lt;P&gt;Thanks for your suggestion. It is nearly what I need. I would like to know whether &lt;STRONG&gt;vpuenc_h264&lt;/STRONG&gt; could be used instead of &lt;STRONG&gt;v4l2h264enc&lt;/STRONG&gt;, and whether the resolution could be 1440x1080 or 1920x1080 instead of 640x480, please?&lt;/P&gt;&lt;P&gt;For separate streams, I used:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;gst-launch-1.0 -v v4l2src device=/dev/video1 ! imxvideoconvert_g2d ! "video/x-raw, width=1440, height=1080, framerate=30/1" ! vpuenc_h264 ! rtph264pay pt=96 ! rtpstreampay ! tcpserversink host=192.168.110.8 port=5001 blocksize=512000 sync=false &amp;amp; \
gst-launch-1.0 -v v4l2src device=/dev/video0 ! imxvideoconvert_g2d ! "video/x-raw, width=1440, height=1080, framerate=30/1" ! vpuenc_h264 ! rtph264pay pt=96 ! rtpstreampay ! tcpserversink host=192.168.110.8 port=5000 blocksize=512000 sync=false&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Best Regards,&lt;BR /&gt;K.&lt;/P&gt;</description>
      <pubDate>Tue, 04 Oct 2022 07:49:33 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1531973#M195944</guid>
      <dc:creator>khang_letruong</dc:creator>
      <dc:date>2022-10-04T07:49:33Z</dc:date>
    </item>
    <item>
      <title>Re: [iMX8MPlus] How to mix two video sources into one with Gstreamer ?</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1532107#M195949</link>
      <description>&lt;P&gt;Dear&amp;nbsp;&lt;a href="https://community.nxp.com/t5/user/viewprofilepage/user-id/147542"&gt;@khang_letruong&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;
&lt;P&gt;Please let me try it on my side.&lt;/P&gt;
&lt;P&gt;But at first glance, I am not sure vpuenc_h264 will work in my scenario, because vpuenc_h264 seems to support only up to 1920x1088. In my case, I would end up with a 1920x2160 stream.&lt;/P&gt;
&lt;BLOCKQUOTE&gt;
&lt;P&gt;SINK template: 'sink'&lt;BR /&gt;Availability: Always&lt;BR /&gt;Capabilities:&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;video/x-raw&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;format: { (string)NV12, (string)I420, (string)YUY2, (string)UYVY, (string)RGBA, (string)RGBx, (string)RGB16, (string)RGB15, (string)BGRA, (string)BGRx, (string)BGR16 }&lt;BR /&gt;&amp;nbsp; &amp;nbsp;&lt;STRONG&gt; &amp;nbsp; &amp;nbsp;width: [ 64, 1920 ]&lt;/STRONG&gt;&lt;BR /&gt;&lt;STRONG&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;height: [ 64, 1088 ]&lt;/STRONG&gt;&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;framerate: [ 0/1, 2147483647/1 ]&lt;/P&gt;
&lt;/BLOCKQUOTE&gt;
&lt;P&gt;Best regards,&lt;/P&gt;
&lt;P&gt;Diana&lt;/P&gt;</description>
      <pubDate>Tue, 04 Oct 2022 12:02:42 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1532107#M195949</guid>
      <dc:creator>dianapredescu</dc:creator>
      <dc:date>2022-10-04T12:02:42Z</dc:date>
    </item>
    <item>
      <title>Re: [iMX8MPlus] How to mix two video sources into one with Gstreamer ?</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1532117#M195951</link>
      <description>&lt;P&gt;Dear &lt;a href="https://community.nxp.com/t5/user/viewprofilepage/user-id/183729"&gt;@dianapredescu&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;You can test with 1920x1080, which is a common resolution. However, I confirm that vpuenc_h264 works with 1440x1080, which is the resolution of my sensors, and the vpuenc_h264 element, i.e. the hardware-accelerated H264 encoder, is one of the main interests of the i.MX8MP.&lt;/P&gt;&lt;P&gt;Regards,&lt;BR /&gt;Khang&lt;/P&gt;</description>
      <pubDate>Tue, 04 Oct 2022 13:32:23 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1532117#M195951</guid>
      <dc:creator>khang_letruong</dc:creator>
      <dc:date>2022-10-04T13:32:23Z</dc:date>
    </item>
    <item>
      <title>Re: [iMX8MPlus] How to mix two video sources into one with Gstreamer ?</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1532126#M195953</link>
      <description>&lt;P&gt;Diana's solution indeed looks promising.&lt;/P&gt;&lt;P&gt;On the other hand, an H264 elementary stream by itself does not embed timestamps; however, RTP, RTSP and MPEG-TS do.&lt;/P&gt;</description>
      <pubDate>Tue, 04 Oct 2022 13:14:24 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1532126#M195953</guid>
      <dc:creator>malik_cisse</dc:creator>
      <dc:date>2022-10-04T13:14:24Z</dc:date>
    </item>
    <item>
      <title>Re: [iMX8MPlus] How to mix two video sources into one with Gstreamer ?</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1532468#M195981</link>
      <description>&lt;P&gt;Dear &lt;a href="https://community.nxp.com/t5/user/viewprofilepage/user-id/183729"&gt;@dianapredescu&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;Could you please also explain why you said "... at the end I would have a 1920x2160 stream" in your given example?&lt;/P&gt;&lt;P&gt;Thanks in advance and best regards,&lt;/P&gt;&lt;P&gt;Khang&lt;/P&gt;</description>
      <pubDate>Wed, 05 Oct 2022 06:44:52 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1532468#M195981</guid>
      <dc:creator>khang_letruong</dc:creator>
      <dc:date>2022-10-05T06:44:52Z</dc:date>
    </item>
    <item>
      <title>Re: [iMX8MPlus] How to mix two video sources into one with Gstreamer ?</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1532561#M195991</link>
      <description>&lt;P&gt;Khang,&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I was thinking: if I compose two 1920x1080 streams (one above the other) before encoding and streaming over the network, I would have a window of 1920x2160 resolution, right? Were you saying that I can rescale the output window?&lt;/P&gt;
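&lt;P&gt;For illustration, something like this untested sketch (adapted from the imxcompositor_g2d examples earlier in the thread; videotestsrc stands in for real sources) is what I have in mind for rescaling the composed output back to 1920x1080 before encoding:&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;gst-launch-1.0 imxcompositor_g2d name=comp \
sink_0::xpos=0 sink_0::ypos=0 sink_0::width=1920 sink_0::height=1080 \
sink_1::xpos=0 sink_1::ypos=1080 sink_1::width=1920 sink_1::height=1080 \
! queue ! imxvideoconvert_g2d ! video/x-raw, width=1920, height=1080 \
! vpuenc_h264 ! h264parse ! fakesink \
videotestsrc ! video/x-raw, width=1920, height=1080 ! queue ! comp.sink_0 \
videotestsrc ! video/x-raw, width=1920, height=1080 ! queue ! comp.sink_1&lt;/LI-CODE&gt;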
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="dianapredescu_0-1664967500942.png" style="width: 400px;"&gt;&lt;img src="https://community.nxp.com/t5/image/serverpage/image-id/195770iA691F5E67D96C34E/image-size/medium?v=v2&amp;amp;px=400" role="button" title="dianapredescu_0-1664967500942.png" alt="dianapredescu_0-1664967500942.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;Best regards,&lt;/P&gt;
&lt;P&gt;Diana&lt;/P&gt;</description>
      <pubDate>Wed, 05 Oct 2022 11:00:13 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1532561#M195991</guid>
      <dc:creator>dianapredescu</dc:creator>
      <dc:date>2022-10-05T11:00:13Z</dc:date>
    </item>
    <item>
      <title>Re: [iMX8MPlus] How to mix two video sources into one with Gstreamer ?</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1532578#M195993</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.nxp.com/t5/user/viewprofilepage/user-id/183729"&gt;@dianapredescu&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;I was referring to your example of 640x480 and was wondering how you could have 1920x2160 instead.&lt;/P&gt;&lt;P&gt;Also, 1920x2160 would exceed the capacity of a single encoder but not that of dual encoders. My question is whether both encoders participate in encoding this composition, or just a single one?&lt;/P&gt;&lt;P&gt;Best,&lt;/P&gt;&lt;P&gt;K.&lt;/P&gt;</description>
      <pubDate>Wed, 05 Oct 2022 11:41:42 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1532578#M195993</guid>
      <dc:creator>khang_letruong</dc:creator>
      <dc:date>2022-10-05T11:41:42Z</dc:date>
    </item>
    <item>
      <title>Re: [iMX8MPlus] How to mix two video sources into one with Gstreamer ?</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1532650#M196005</link>
      <description>&lt;P&gt;Dear&amp;nbsp;&lt;a href="https://community.nxp.com/t5/user/viewprofilepage/user-id/147542"&gt;@khang_letruong&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;
&lt;P&gt;Sorry for mixing things up. No, in the case I previously tested with two 640x480 streams, I had a 640x960 video window on the client side. The 1920x2160 window came into the discussion from the scenario of using two 1920x1080 video streams. (This is what I'm currently investigating: composition of two FHD streams.)&lt;/P&gt;
&lt;P&gt;In the latest BSP, I've seen that we modified the VPU capabilities. It looks like with the latest BSPs one can encode more than 1920x1080 with v4l2h264enc. (&lt;A href="https://github.com/nxp-imx/linux-imx/blob/lf-5.15.y/drivers/mxc/hantro_v4l2/vsi-v4l2-config.c#L755" target="_blank"&gt;https://github.com/nxp-imx/linux-imx/blob/lf-5.15.y/drivers/mxc/hantro_v4l2/vsi-v4l2-config.c#L755&lt;/A&gt;)&lt;/P&gt;
&lt;P&gt;So I would understand that I should be able to encode 1920x2160 with one instance, right?&lt;/P&gt;
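&lt;P&gt;As an untested sketch of how I would check this (videotestsrc stands in for a real source, and the NV12 caps and output path are assumptions):&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;gst-launch-1.0 videotestsrc num-buffers=300 \
! video/x-raw, format=NV12, width=1920, height=2160, framerate=30/1 \
! v4l2h264enc ! h264parse ! filesink location=/tmp/test_1920x2160.h264&lt;/LI-CODE&gt;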
&lt;P&gt;Best regards,&lt;/P&gt;
&lt;P&gt;Diana&lt;/P&gt;</description>
      <pubDate>Wed, 05 Oct 2022 13:35:36 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1532650#M196005</guid>
      <dc:creator>dianapredescu</dc:creator>
      <dc:date>2022-10-05T13:35:36Z</dc:date>
    </item>
    <item>
      <title>Re: [iMX8MPlus] How to mix two video sources into one with Gstreamer ?</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1532774#M196011</link>
      <description>&lt;P&gt;Dear &lt;a href="https://community.nxp.com/t5/user/viewprofilepage/user-id/183729"&gt;@dianapredescu&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;Thanks for pointing out the modification of the VPU driver in the latest BSP.&lt;/P&gt;&lt;BLOCKQUOTE&gt;&lt;P&gt;So I would understand that I should be able to encode 1920x2160 with one instance, right?&lt;/P&gt;&lt;/BLOCKQUOTE&gt;&lt;P&gt;Effectively, this is also the question I am hoping you at NXP, who released the BSP, can answer. Since the i.MX8MP spec says that the max. encoding resolution is 1080p, that is why I would like to know how many encoders participate in the pipeline of your example.&lt;/P&gt;&lt;P&gt;A discussion about the encoding limitation of the VPU is here: &lt;A href="https://community.nxp.com/t5/i-MX-Processors/IMX8M-Plus-Hantro-4K-Encoder/m-p/1282890" target="_blank"&gt;https://community.nxp.com/t5/i-MX-Processors/IMX8M-Plus-Hantro-4K-Encoder/m-p/1282890&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Best Regards,&lt;BR /&gt;Khang&lt;/P&gt;</description>
      <pubDate>Wed, 05 Oct 2022 16:38:07 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1532774#M196011</guid>
      <dc:creator>khang_letruong</dc:creator>
      <dc:date>2022-10-05T16:38:07Z</dc:date>
    </item>
    <item>
      <title>Re: [iMX8MPlus] How to mix two video sources into one with Gstreamer ?</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1534837#M196157</link>
      <description>&lt;P&gt;Dear&amp;nbsp;&lt;a href="https://community.nxp.com/t5/user/viewprofilepage/user-id/147542"&gt;@khang_letruong&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;I tried to run a few examples for both 1920x1080 and your custom resolution of 1440x1080, and here is some feedback.&lt;/P&gt;
&lt;P&gt;The &lt;STRONG&gt;NXP commitment&lt;/STRONG&gt; for best performance for VPU encoding is &lt;STRONG&gt;1920x1080&lt;/STRONG&gt;, as stated in the Reference Manual, even though the IP HW can support more than 1080p. As seen in the driver, the 8MP maximum encoding resolution is 1920x8192 (the 8MP introduces a HW limitation on the width). One encoding pipeline occupies only one instance.&lt;/P&gt;
&lt;P&gt;So adapting the pipeline I've previously sent for your use case (encoding one 1440x2160 output) appears to be possible, but the performance does not look so good. I suggest trying it on your side and letting me know how you evaluate it.&lt;/P&gt;
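&lt;P&gt;For reference, a rough, untested sketch of such an adaptation (device nodes and element names taken from the compositor examples earlier in this thread; the output path is an assumption):&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;gst-launch-1.0 imxcompositor_g2d name=comp \
sink_0::xpos=0 sink_0::ypos=0 sink_0::width=1440 sink_0::height=1080 \
sink_1::xpos=0 sink_1::ypos=1080 sink_1::width=1440 sink_1::height=1080 \
! queue ! imxvideoconvert_g2d ! video/x-raw, width=1440, height=2160 \
! v4l2h264enc ! h264parse ! filesink location=/tmp/comp_1440x2160.h264 \
v4l2src device=/dev/video0 ! imxvideoconvert_g2d ! video/x-raw, width=1440, height=1080 ! queue ! comp.sink_0 \
v4l2src device=/dev/video2 ! imxvideoconvert_g2d ! video/x-raw, width=1440, height=1080 ! queue ! comp.sink_1&lt;/LI-CODE&gt;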
&lt;P&gt;Best regards,&lt;/P&gt;
&lt;P&gt;Diana&lt;/P&gt;</description>
      <pubDate>Mon, 10 Oct 2022 11:57:39 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1534837#M196157</guid>
      <dc:creator>dianapredescu</dc:creator>
      <dc:date>2022-10-10T11:57:39Z</dc:date>
    </item>
    <item>
      <title>Re: [iMX8MPlus] How to mix two video sources into one with Gstreamer ?</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1534864#M196160</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.nxp.com/t5/user/viewprofilepage/user-id/183729"&gt;@dianapredescu&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;Thanks for your thorough explanation. Another confusion was &lt;STRONG&gt;v4l2h264enc&lt;/STRONG&gt; vs &lt;STRONG&gt;&lt;SPAN class=""&gt;vpuenc_h264&lt;/SPAN&gt;&lt;/STRONG&gt;, as per the following question:&lt;/P&gt;&lt;BLOCKQUOTE&gt;&lt;P&gt;Thanks for your suggestion. It is nearly what I need. I would like to know if &lt;STRONG&gt;&lt;SPAN class=""&gt;vpuenc_h264&lt;/SPAN&gt;&lt;/STRONG&gt; could be used instead of &lt;STRONG&gt;v4l2h264enc&lt;/STRONG&gt;&amp;nbsp;&lt;SPAN&gt;&lt;SPAN class=""&gt; and the resolution would be 1440x1080 or 1920x1080 instead of 640x480, please ?&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/P&gt;&lt;/BLOCKQUOTE&gt;&lt;P&gt;I always thought that &lt;STRONG&gt;&lt;SPAN class=""&gt;vpuenc_h264&lt;/SPAN&gt;&lt;/STRONG&gt; was the H264 hardware-accelerated encoding element of the i.MX8M Plus, but it turned out that it was not, according to the following doc:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="nxf65025_1-1647929849107.png" style="width: 999px;"&gt;&lt;img src="https://community.nxp.com/t5/image/serverpage/image-id/196261iEBC62B8011319685/image-size/large?v=v2&amp;amp;px=999" role="button" title="nxf65025_1-1647929849107.png" alt="nxf65025_1-1647929849107.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Regards,&lt;BR /&gt;K.&lt;/P&gt;</description>
      <pubDate>Mon, 10 Oct 2022 12:54:19 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1534864#M196160</guid>
      <dc:creator>khang_letruong</dc:creator>
      <dc:date>2022-10-10T12:54:19Z</dc:date>
    </item>
    <item>
      <title>Re: [iMX8MPlus] How to mix two video sources into one with Gstreamer ?</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1534894#M196163</link>
      <description>&lt;P&gt;Dear &lt;a href="https://community.nxp.com/t5/user/viewprofilepage/user-id/147542"&gt;@khang_letruong&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;I understand where the confusion comes from. When we released the 8MP platform, I believe vpudec and vpuenc_h264 were the only codecs we provided, and they were HW accelerated. We used those because the v4l2 software support wasn't ready for the VPU decoder/encoder at that time. Now, NXP recommends using the latest VPU drivers: &lt;STRONG&gt;v4l2h264enc&lt;/STRONG&gt; &amp;amp; &lt;STRONG&gt;v4l2h264dec&lt;/STRONG&gt; (that's why the v4l2 codecs follow the Hantro supported resolution matrix -&amp;gt; max. 1920x4096 for H264 decoding and 1920x8192 for H264 encoding).&lt;/P&gt;
&lt;P&gt;Since we are using the open-source v4l2 framework and we exposed the full capability, any customer can make use of it. But the NXP official commitment will remain 1080p for both encoding and decoding.&lt;/P&gt;
&lt;P&gt;Best regards,&lt;/P&gt;
&lt;P&gt;Diana&lt;/P&gt;</description>
      <pubDate>Mon, 10 Oct 2022 13:29:40 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1534894#M196163</guid>
      <dc:creator>dianapredescu</dc:creator>
      <dc:date>2022-10-10T13:29:40Z</dc:date>
    </item>
    <item>
      <title>Re: [iMX8MPlus] How to mix two video sources into one with Gstreamer ?</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1534910#M196165</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.nxp.com/t5/user/viewprofilepage/user-id/183729"&gt;@dianapredescu&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;I did not know in which BSP &lt;STRONG&gt;v4l2h264dec&lt;/STRONG&gt; and &lt;STRONG&gt;v4l2h264enc&lt;/STRONG&gt; were introduced. I need to re-check the release notes and associated documents.&lt;/P&gt;&lt;P&gt;In any case, thanks for everything.&lt;/P&gt;&lt;P&gt;Best,&lt;/P&gt;&lt;P&gt;K.&lt;/P&gt;</description>
      <pubDate>Mon, 10 Oct 2022 13:42:06 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1534910#M196165</guid>
      <dc:creator>khang_letruong</dc:creator>
      <dc:date>2022-10-10T13:42:06Z</dc:date>
    </item>
    <item>
      <title>Re: [iMX8MPlus] How to mix two video sources into one with Gstreamer ?</title>
      <link>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1535231#M196186</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.nxp.com/t5/user/viewprofilepage/user-id/183729"&gt;@dianapredescu&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;For confirmation: v4l2h264dec and v4l2h264enc have been available since BSP 5.10.35, while I am still stuck on BSP-5.4.70, in which there are only vpuenc_h264 and vpudec_h264.&lt;/P&gt;&lt;P&gt;Regards,&lt;BR /&gt;K.&lt;/P&gt;</description>
      <pubDate>Tue, 11 Oct 2022 02:45:21 GMT</pubDate>
      <guid>https://community.nxp.com/t5/i-MX-Processors/iMX8MPlus-How-to-mix-two-video-sources-into-one-with-Gstreamer/m-p/1535231#M196186</guid>
      <dc:creator>khang_letruong</dc:creator>
      <dc:date>2022-10-11T02:45:21Z</dc:date>
    </item>
  </channel>
</rss>

