It's fairly simple in our application. The overlay QImage is redrawn asynchronously about once per second (actually two QImages: one being drawn into, one for display, with a mutex guarding the swap between them).
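A minimal sketch of that double buffering, assuming Qt (the class and member names are illustrative, not our exact code):

#include <QImage>
#include <QMutex>
#include <QMutexLocker>

class OverlayBuffer {
public:
    explicit OverlayBuffer(const QSize &size)
        // QImage::Format_ARGB32 is laid out B,G,R,A in little-endian
        // memory, which matches the "BGRA" caps set on the appsrc below
        : m_draw(size, QImage::Format_ARGB32),
          m_display(size, QImage::Format_ARGB32) {}

    QImage &drawTarget() { return m_draw; } // render thread paints here (~1 Hz)

    // Render thread: publish a finished overlay
    void publish() {
        QMutexLocker lock(&m_mutex);
        m_draw.swap(m_display); // just swaps the implicitly shared data pointers
    }

    // appsrc need-data callback (5 fps): grab the current display image
    QImage current() {
        QMutexLocker lock(&m_mutex);
        return m_display; // shallow copy; safe to read after the lock drops
    }

private:
    QMutex m_mutex;
    QImage m_draw, m_display;
};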
The GStreamer appsrc element is configured to emit video frames at 5 fps; its need-data callback takes the current QImage and injects it into the stream, where imxcompositor_g2d blends it over the camera video. The appsrc is set up like this:
ogstOverlaySrc = gst_bin_get_by_name(GST_BIN(ogstPipeline), "appsrc");
if (ogstOverlaySrc) {
    // Caps must match what the need-data callback pushes
    GstCaps *caps = gst_caps_new_simple("video/x-raw",
        "format", G_TYPE_STRING, "BGRA",
        "width", G_TYPE_INT, osize.width(),
        "height", G_TYPE_INT, osize.height(),
        "framerate", GST_TYPE_FRACTION, 5, 1,
        NULL);
    g_object_set(G_OBJECT(ogstOverlaySrc), "caps", caps, NULL);
    gst_caps_unref(caps); // g_object_set keeps its own reference

    g_object_set(G_OBJECT(ogstOverlaySrc),
        "stream-type", 0, // GST_APP_STREAM_TYPE_STREAM
        "format", GST_FORMAT_TIME,
        "is-live", FALSE,
        NULL);
    g_signal_connect(ogstOverlaySrc, "need-data",
        G_CALLBACK(gstOverlayAddFrameCallback), gpointer(this));
}
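And a hedged sketch of what gstOverlayAddFrameCallback can look like (not shown above; here I pretend user_data points at the OverlayBuffer from the earlier sketch, while in our code it is the owning class): it copies the current QImage into a GstBuffer, timestamps it for 5 fps, and pushes it through the appsrc "push-buffer" action signal.

#include <gst/gst.h>

static void gstOverlayAddFrameCallback(GstElement *appsrc, guint /*length*/,
                                       gpointer user_data)
{
    OverlayBuffer *overlay = static_cast<OverlayBuffer *>(user_data);
    QImage img = overlay->current();

    // Copy the pixel data into a fresh buffer (gst_buffer_fill memcpys)
    const gsize size = gsize(img.bytesPerLine()) * gsize(img.height());
    GstBuffer *buffer = gst_buffer_new_allocate(NULL, size, NULL);
    gst_buffer_fill(buffer, 0, img.constBits(), size);

    // Timestamps for a 5 fps stream, since we set format=GST_FORMAT_TIME
    static guint64 frameCount = 0;
    GST_BUFFER_PTS(buffer) = gst_util_uint64_scale(frameCount++, GST_SECOND, 5);
    GST_BUFFER_DURATION(buffer) = gst_util_uint64_scale(1, GST_SECOND, 5);

    GstFlowReturn ret;
    g_signal_emit_by_name(appsrc, "push-buffer", buffer, &ret);
    gst_buffer_unref(buffer); // the action signal takes its own reference
}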
Pipeline bits:
imxcompositor_g2d name=c latency=20000000 sink_1::alpha=1.0 ! identity drop-allocation=true
v4l2src device=/dev/video3 ! video/x-raw,width=1920,height=1080,framerate=25/1 ! c.sink_0
appsrc name="appsrc" ! videoconvert ! video/x-raw,format=ARGB ! c.sink_1
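One way those fragments can be assembled in code with gst_parse_launch (the sink after the identity element is elided above; waylandsink is only an assumed placeholder for whatever display sink you actually use):

GError *error = NULL;
GstElement *ogstPipeline = gst_parse_launch(
    "imxcompositor_g2d name=c latency=20000000 sink_1::alpha=1.0 "
    "! identity drop-allocation=true ! waylandsink " // sink is an assumption
    "v4l2src device=/dev/video3 "
    "! video/x-raw,width=1920,height=1080,framerate=25/1 ! c.sink_0 "
    "appsrc name=appsrc ! videoconvert "
    "! video/x-raw,format=ARGB ! c.sink_1",
    &error);
if (!ogstPipeline) {
    g_printerr("Pipeline error: %s\n", error->message);
    g_clear_error(&error);
}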