I have been having some challenges creating a fully functioning GStreamer application.
I have prototyped all the functionality using gst-launch-1.0 and have had success, but my final solution needs to be compiled into an application.
I have been able to get every single element in my test pipelines to function in a compiled application; the issue is that I can't get all of the elements to work together.
For example, I have written the following pipeline into my application:
gst-launch-1.0 rtspsrc location=rtsp://10.1.0.92:554/MainStream ! rtph264depay ! h264parse ! imxvpudec ! imxeglvivsink borderless-window=true sync=false
After sorting out the sometimes pads (handling rtspsrc's pad-added signal), my application works great.
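For context, the pad-added handling I mean looks roughly like this (a simplified sketch rather than my exact code; "depay" stands for the rtph264depay element created earlier):

/* Simplified sketch: link rtspsrc's sometimes pad to rtph264depay once it appears. */
static void
on_rtspsrc_pad_added (GstElement *src, GstPad *new_pad, gpointer user_data)
{
  GstElement *depay = GST_ELEMENT (user_data);
  GstPad *sink_pad = gst_element_get_static_pad (depay, "sink");

  if (!gst_pad_is_linked (sink_pad)) {
    GstCaps *caps = gst_pad_get_current_caps (new_pad);
    if (caps != NULL) {
      const gchar *name =
          gst_structure_get_name (gst_caps_get_structure (caps, 0));
      /* rtspsrc can also add RTCP pads; only link the RTP pad. */
      if (g_str_has_prefix (name, "application/x-rtp"))
        gst_pad_link (new_pad, sink_pad);
      gst_caps_unref (caps);
    }
  }
  gst_object_unref (sink_pad);
}

/* ... in the setup code, after creating rtspsrc and depay: */
g_signal_connect (rtspsrc, "pad-added",
                  G_CALLBACK (on_rtspsrc_pad_added), depay);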
I have also compiled this into my application:
gst-launch-1.0 -v imxipucompositor name=comp \
sink_0::xpos=0 sink_0::ypos=0 \
sink_1::xpos=512 sink_1::ypos=396 \
! video/x-raw,width=1024,height=768 ! imxeglvivsink native-display=:0 borderless-window=true sync=false \
videotestsrc pattern=0 ! video/x-raw,width=512,height=384 ! queue ! comp.sink_0 \
videotestsrc pattern=1 ! video/x-raw,width=512,height=384 ! queue ! comp.sink_1
After sorting out request pads, my application works great.
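The request-pad side is along these lines (again just a sketch; "comp" is the imxipucompositor and "queue0" is the queue feeding sink_0):

/* Sketch: request a sink pad on the compositor, position it, and
 * link the branch's queue to it. */
GstPad *comp_pad = gst_element_get_request_pad (comp, "sink_%u");
g_object_set (comp_pad, "xpos", 0, "ypos", 0, NULL);

GstPad *queue_src = gst_element_get_static_pad (queue0, "src");
gst_pad_link (queue_src, comp_pad);
gst_object_unref (queue_src);

/* comp_pad is released later with gst_element_release_request_pad(). */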
I have also compiled this into my application:
gst-launch-1.0 -v $MIXER name=comp \
sink_0::xpos=0 sink_0::ypos=0 sink_0::width=512 sink_0::height=288 \
sink_1::xpos=512 sink_1::ypos=0 sink_1::width=512 sink_1::height=288 \
sink_2::xpos=0 sink_2::ypos=288 sink_2::width=512 sink_2::height=288 \
sink_3::xpos=512 sink_3::ypos=288 sink_3::width=512 sink_3::height=288 \
! video/x-raw,width=1024,height=576 ! imxipuvideotransform ! imxeglvivsink borderless-window=true sync=false \
rtspsrc location=rtsp://10.1.0.92:554/MainStream ! rtph264depay ! h264parse ! imxvpudec ! imxipuvideotransform ! video/x-raw,width=512,height=288 ! queue ! comp.sink_0 \
rtspsrc location=rtsp://10.1.0.94:554/MainStream ! rtph264depay ! h264parse ! imxvpudec ! imxipuvideotransform ! video/x-raw,width=512,height=288 ! queue ! comp.sink_1 \
rtspsrc location=rtsp://10.1.0.94:554/MainStream ! rtph264depay ! h264parse ! imxvpudec ! imxipuvideotransform ! video/x-raw,width=512,height=288 ! queue ! comp.sink_2 \
rtspsrc location=rtsp://10.1.0.92:554/MainStream ! rtph264depay ! h264parse ! imxvpudec ! imxipuvideotransform ! video/x-raw,width=512,height=288 ! queue ! comp.sink_3
When $MIXER is set to "videomixer", my application functions correctly; however, CPU usage is high. When $MIXER is set to imxg2dcompositor or imxipucompositor, the pipeline does not work in my application.
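For reference, this is roughly how I would expect the four camera branches to be wired up in the application, reusing the same pad-added handler sketched above (simplified; the element arrays src[], depay[], queue[] and the static depay ! parse ! dec ! transform ! capsfilter ! queue links are assumed to be created beforehand):

/* Sketch: four camera branches feeding the compositor's request pads.
 * Positions, sizes, and URIs match the pipeline above. */
struct { const gchar *uri; gint x, y; } layout[4] = {
  { "rtsp://10.1.0.92:554/MainStream",   0,   0 },
  { "rtsp://10.1.0.94:554/MainStream", 512,   0 },
  { "rtsp://10.1.0.94:554/MainStream",   0, 288 },
  { "rtsp://10.1.0.92:554/MainStream", 512, 288 },
};

for (gint i = 0; i < 4; i++) {
  g_object_set (src[i], "location", layout[i].uri, NULL);
  g_signal_connect (src[i], "pad-added",
                    G_CALLBACK (on_rtspsrc_pad_added), depay[i]);

  GstPad *comp_pad = gst_element_get_request_pad (comp, "sink_%u");
  /* width/height are the compositor-pad properties from the pipeline above. */
  g_object_set (comp_pad,
                "xpos", layout[i].x, "ypos", layout[i].y,
                "width", 512, "height", 288, NULL);

  GstPad *queue_src = gst_element_get_static_pad (queue[i], "src");
  gst_pad_link (queue_src, comp_pad);
  gst_object_unref (queue_src);
}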
What do I need to do differently with the imx compositors compared to videomixer?
If I can get an imx compositor to work with H.264 live video, I would expect to be able to use a simpler pipeline:
gst-launch-1.0 -v imxg2dcompositor name=comp \
sink_0::xpos=0 sink_0::ypos=0 sink_0::width=512 sink_0::height=288 \
sink_1::xpos=512 sink_1::ypos=0 sink_1::width=512 sink_1::height=288 \
sink_2::xpos=0 sink_2::ypos=384 sink_2::width=512 sink_2::height=288 sink_2::keep-aspect-ratio=false sink_2::rotation=2 \
sink_3::xpos=512 sink_3::ypos=384 sink_3::width=512 sink_3::height=288 sink_3::keep-aspect-ratio=false sink_3::rotation=2 \
! "video/x-raw, width=1024, height=768" ! imxeglvivsink borderless-window=true sync=false \
rtspsrc location=rtsp://10.1.0.92:554/MainStream ! rtph264depay ! h264parse ! imxvpudec ! queue ! comp.sink_0 \
rtspsrc location=rtsp://10.1.0.94:554/MainStream ! rtph264depay ! h264parse ! imxvpudec ! queue ! comp.sink_1 \
rtspsrc location=rtsp://10.1.0.92:554/SubStream ! rtph264depay ! h264parse ! imxvpudec ! queue ! comp.sink_2 \
rtspsrc location=rtsp://10.1.0.94:554/SubStream ! rtph264depay ! h264parse ! imxvpudec ! queue ! comp.sink_3
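In the application, the extra per-pad settings would presumably just be more properties on the requested pad, e.g. for sink_2 (property names taken from the pipeline above):

/* Sketch: bottom-left tile, scaled and rotated by the compositor pad. */
GstPad *pad2 = gst_element_get_request_pad (comp, "sink_%u");
g_object_set (pad2,
              "xpos", 0, "ypos", 384,
              "width", 512, "height", 288,
              "keep-aspect-ratio", FALSE,
              "rotation", 2, NULL);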
That simpler pipeline does not work with the videomixer element, though, since videomixer won't scale its inputs.
Does anyone have any pointers on how videomixer and the imx*compositor elements differ, especially with respect to an RTSP live video feed versus a videotestsrc?