I need expert support. My goal is to display four camera streams simultaneously, 640x400 each, on a screen with a resolution of 1280x800. Currently I have problems showing two streams simultaneously when the streams come from cameras of different types.
My Hardware:
1. Mainboard with i.MX6 quad processor and 100MBit/s Ethernet interface.
2. Ethernet camera from manufacturer A. H.264 RTP stream over UDP
Resolution 1: 1280x960@30fps progressive.
H.264 profile = baseline / level = 4.1 (the camera always reports level 4.1, independent of resolution and frame rate)
Bandwidth approximately 22MBit/s
Resolution 2: 640x400@30fps progressive.
H.264 profile = baseline / level = 4.1 (the camera always reports level 4.1, independent of resolution and frame rate)
Bandwidth approximately 24MBit/s
3. Ethernet camera from manufacturer B. H.264 RTP stream over UDP
Resolution 1: 1280x720@30fps progressive.
H.264 profile = baseline / level = 3.1 (the camera adjusts the level depending on resolution and frame rate)
Resolution 2: 640x400@30fps progressive.
H.264 profile = baseline / level = 2.2 (the camera adjusts the level depending on resolution and frame rate)
Bandwidth approximately 24MBit/s
4. 100MBit/s switch
General setup:
I connect two cameras and my mainboard to the switch. To display both streams simultaneously I use GStreamer:
gst-launch-1.0 -v udpsrc port=5002 ! application/x-rtp,payload=96 ! rtph264depay ! queue ! h264parse ! imxvpudec ! videocrop top=0 left=0 bottom=0 right=0 ! imxipuvideosink sync=false window-x-coord=0 window-y-coord=0 window-width=400 window-height=480 force-aspect-ratio=false &
gst-launch-1.0 -v udpsrc port=5004 ! application/x-rtp,payload=96 ! rtph264depay ! queue ! h264parse ! imxvpudec ! videocrop top=0 left=0 bottom=0 right=0 ! imxipuvideosink sync=false window-x-coord=400 window-y-coord=0 window-width=400 window-height=480 force-aspect-ratio=false &
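For reference, the four-stream end goal (a 2x2 grid of 640x400 tiles on the 1280x800 screen) would just extend the same pattern. This is only a sketch: ports 5006 and 5008 are placeholders I made up, only 5002/5004 exist in my setup so far.

```shell
#!/bin/sh
# Sketch of the eventual 2x2 layout: four 640x400 tiles on the 1280x800 screen.
# Ports 5006/5008 are placeholders; only 5002 and 5004 exist in my setup so far.
for i in 0 1 2 3; do
    port=$((5002 + 2 * i))
    x=$(((i % 2) * 640))    # column 0 or 1
    y=$(((i / 2) * 400))    # row 0 or 1
    gst-launch-1.0 udpsrc port=$port ! application/x-rtp,payload=96 \
        ! rtph264depay ! queue ! h264parse ! imxvpudec \
        ! imxipuvideosink sync=false window-x-coord=$x window-y-coord=$y \
        window-width=640 window-height=400 force-aspect-ratio=false &
done
```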
Setup 1: Two cameras from manufacturer A with resolution 1 (1280x960@30fps).
Result 1: Both streams are shown on my display without noticeable latency. Everything is fine.
Setup 2: Two cameras from manufacturer B with resolution 1 (1280x720@30fps).
Result 2: Both streams are shown on my display without noticeable latency. Everything is fine.
Setup 3: Two cameras from manufacturer A with resolution 2 (640x400@30fps).
Result 3: Both streams are shown on my display without noticeable latency. Everything is fine.
Setup 4: Two cameras from manufacturer B with resolution 2 (640x400@30fps).
Result 4: Both streams are shown on my display without noticeable latency. Everything is fine.
Conclusion:
Two cameras from the same manufacturer work well.
I also tried different resolutions with the same camera type. This also works fine.
Now it gets interesting.
Setup 5: One camera from manufacturer A and one from manufacturer B. A with resolution 1 (1280x960@30fps) and B with resolution 1 (1280x720@30fps).
Result 5: The camera from manufacturer B works well, but the stream from camera A stalls. Only occasionally is a new image shown, roughly every 5 seconds.
Camera A shows the same behaviour even when I don't display both streams simultaneously. I use GStreamer to show camera A:
gst-launch-1.0 -v udpsrc port=5002 ! application/x-rtp,payload=96 ! rtph264depay ! queue ! h264parse ! imxvpudec ! imxeglvivsink sync=false
It takes a few seconds until GStreamer brings up the pipeline. Then a new image is shown only every 5 seconds. If I disconnect camera B from the switch, camera A is immediately shown normally again.
When I swap the roles (show camera B with GStreamer), camera A has no influence. Camera B is shown without any limitation.
First I thought it had something to do with the resolution and the frame rate: the hardware H.264 decoder of the i.MX6 can decode up to 8 streams at D1@30fps, but only 1 stream at 1080p@30fps.
But I reduced the resolution to 640x400@30fps on both cameras and saw the same behaviour. As soon as a second stream comes in that is not identical to camera A's stream, I only get new images from camera A every 5 seconds.
Camera B is unaffected and always works as I expect. I can connect two cameras from manufacturer B at 1920x1080@30fps and they are shown without latency or judder.
What is the reason for this behaviour?
- Is it the fact that camera A always sends baseline profile with level 4.1, so that the decoder allocates too many resources?
But then why can I show two streams from camera A at 1280x960@30fps without problems?
- Do the two cameras use different codec data, and the decoder is not able to switch fast enough?
I exclude bandwidth problems because the total bandwidth is only about 50 MBit/s.
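Some quick arithmetic behind these two points. The MaxMBPS limit is taken from H.264 Table A-1 and the ~24 MBit/s figure for camera B's 720p stream is my assumption (only the 640x400 figure is stated above), so treat both as estimates:

```python
# Back-of-the-envelope numbers for the two hypotheses above.
# MaxMBPS for level 4.1 is from H.264 Table A-1 (assumption, not measured here).

def mb_per_sec(width, height, fps):
    """Macroblock rate: one macroblock covers 16x16 pixels."""
    return ((width + 15) // 16) * ((height + 15) // 16) * fps

# Bandwidth in setup 5: camera A res 1 (~22 MBit/s) plus camera B
# (~24 MBit/s, assuming its 720p stream is close to the stated 640x400 figure).
total_mbit_s = 22 + 24
print("total bandwidth:", total_mbit_s, "MBit/s")  # 46, well below the 100 MBit/s link

# What camera A's 1280x960@30fps stream actually requires, vs. the budget
# its advertised level 4.1 would let the decoder reserve.
required = mb_per_sec(1280, 960, 30)
max_mbps_level_41 = 245760  # MaxMBPS for level 4.1 per Table A-1
print("camera A needs", required, "of", max_mbps_level_41, "macroblocks/s")
```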
Thanks
Edit 16-07-2020:
I changed the GStreamer pipeline a bit to get the codec_data:
gst-launch-1.0 -v udpsrc port=5002 caps="application/x-rtp" ! rtph264depay ! video/x-h264 ! h264parse ! imxvpudec ! imxipuvideosink sync=false
codec_data of the camera from manufacturer A:
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)01420029ffe1003c67420029e3501406642000007d00001d4c0c1bbde000400000000000000000000000000000000000000000000000000000000000000000000000000001000c68ce3c800000000000000000, level=(string)4.1, profile=(string)baseline
codec_data of the camera from manufacturer B:
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)0142e016ffe100162742e0168d680a033a6c800005dc00015f90478a115001000528ce074920, level=(string)2.2, profile=(string)constrained-baseline
I decoded the codec_data and saw that the sequenceParameterSetNALUnit differs between the cameras. I don't know whether this has something to do with my issue.
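For reference, the codec_data above follows the avcC (AVCDecoderConfigurationRecord) layout, so the SPS/PPS can be pulled out with a few lines of Python. This is only a minimal sketch without error handling; camera A's buffer is rebuilt from its structure (23 SPS payload bytes zero-padded to the declared 60, 4 PPS bytes padded to 12) so the long zero runs are readable:

```python
# Minimal avcC (AVCDecoderConfigurationRecord) parser -- a sketch without
# error handling, just enough to compare the SPS/PPS of the two cameras.

def parse_avcc(hex_str):
    data = bytes.fromhex(hex_str)
    info = {
        "profile_idc": data[1],  # 0x42 = 66 = baseline
        "level_idc": data[3],    # e.g. 0x29 = 41 = level 4.1
    }
    pos = 5
    num_sps = data[pos] & 0x1F   # lower 5 bits = number of SPS
    pos += 1
    sps = []
    for _ in range(num_sps):
        length = int.from_bytes(data[pos:pos + 2], "big")
        pos += 2
        sps.append(data[pos:pos + length].hex())
        pos += length
    info["sps"] = sps
    num_pps = data[pos]
    pos += 1
    pps = []
    for _ in range(num_pps):
        length = int.from_bytes(data[pos:pos + 2], "big")
        pos += 2
        pps.append(data[pos:pos + length].hex())
        pos += length
    info["pps"] = pps
    return info

# Camera A's buffer, rebuilt from its structure (byte-identical to the caps above):
cam_a_hex = (
    "01420029ffe1003c"                                   # header, 1 SPS, 60 bytes
    + "67420029e3501406642000007d00001d4c0c1bbde00040"   # SPS payload
    + "00" * 37                                          # zero padding to 60 bytes
    + "01000c" + "68ce3c80" + "00" * 8                   # 1 PPS, 12 bytes, padded
)
cam_b_hex = "0142e016ffe100162742e0168d680a033a6c800005dc00015f90478a115001000528ce074920"

cam_a = parse_avcc(cam_a_hex)
cam_b = parse_avcc(cam_b_hex)
print(cam_a["level_idc"], cam_b["level_idc"])      # 41 22
print(cam_a["sps"][0] == cam_b["sps"][0])          # False -> the SPS differ
```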