I am having an issue getting video to display properly via WebRTC, and the problem seems to be the H.264 encoding done by imxvpuenc_h264. My setup: I use GStreamer to stream RTP to a UDP sink, and then Janus Gateway handles the WebRTC side, which users can view when they connect to a webpage running on the device.
Now, what happens is if I do this:
gst-launch-1.0 videotestsrc is-live=true ! openh264enc ! rtph264pay ! capssetter caps="application/x-rtp,profile-level-id=(string)42e01f" ! udpsink host=127.0.0.1 port=8080 &
everything works perfectly! The video displays on the page with no issues. But as you can see, I'm using openh264enc, so the encoding is done in software. If I swap out the encoder and use the i.MX's GStreamer hardware encoder like this:
gst-launch-1.0 videotestsrc is-live=true ! imxvpuenc_h264 ! rtph264pay ! capssetter caps="application/x-rtp,profile-level-id=(string)42e01f" ! udpsink host=127.0.0.1 port=8080 &
things don't work as well. It only partially works: if I start the pipeline above AFTER I open the webpage on my client, the video shows and plays fine. BUT if the pipeline was started before I open the page, or there is a short network blip, the video doesn't play; it just freezes on the current frame (or shows no frame at all if the page loaded after the stream started). It's as if some data is sent at the very start of the stream that, if missed, prevents playback from ever starting. I tried doing something like this:
gst-launch-1.0 videotestsrc is-live=true ! imxvpuenc_h264 ! fdsink | ffmpeg -f h264 -i - -c:v copy -flags global_header -bsf dump_extra -f rtp rtp://127.0.0.1:8080 &
to have ffmpeg insert the headers, but that made no difference.
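One other thing I was considering trying (I haven't tested this yet, so treat it as a guess) is having the RTP payloader re-send the SPS/PPS in-band periodically, in case the browser's decoder needs those headers to start mid-stream. Something like the following, where the h264parse element is my guess at making the hardware encoder's byte-stream output parseable for the payloader:

```shell
# Untested idea: rtph264pay's config-interval property re-inserts
# SPS/PPS (here every 1 second) so clients joining mid-stream can
# still initialize their decoder. h264parse repackages the raw
# byte-stream coming out of the hardware encoder.
gst-launch-1.0 videotestsrc is-live=true \
    ! imxvpuenc_h264 \
    ! h264parse \
    ! rtph264pay config-interval=1 \
    ! capssetter caps="application/x-rtp,profile-level-id=(string)42e01f" \
    ! udpsink host=127.0.0.1 port=8080 &
```

Would something along those lines address the kind of start-of-stream data I seem to be missing, or am I barking up the wrong tree?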
Any thoughts on what the issue with the hardware-encoded video might be? Or is there a better place to ask this question? Since I can fix the issue simply by not using the hardware encoder, the problem would seem to lie there, so this seemed like the best place to start.