Simultaneous decoding of H.264 streams causes judder


MicMoba
Contributor V

I need expert support. My goal is to display four camera streams simultaneously at 640x400 each on a screen with a resolution of 1280x800. Currently I have problems showing even two streams simultaneously when the streams come from cameras of different types.

My Hardware:

1. Mainboard with i.MX6 quad processor and 100MBit/s Ethernet interface.

2. Ethernet camera from manufacturer A: H.264 RTP stream over UDP

Resolution 1: 1280x960@30fps progressive.

H.264 profile = baseline / level = 4.1 (the camera always reports level 4.1, independent of resolution and frame rate)

Bandwidth approximately 22MBit/s

Resolution 2: 640x400@30fps progressive.

H.264 profile = baseline / level = 4.1 (the camera always reports level 4.1, independent of resolution and frame rate)

Bandwidth approximately 24MBit/s

3. Ethernet camera from manufacturer B: H.264 RTP stream over UDP

Resolution 1: 1280x720@30fps progressive.

H.264 profile = baseline / level = 3.1 (the camera adjusts the level depending on resolution and frame rate)

Resolution 2: 640x400@30fps progressive.

H.264 profile = baseline / level = 2.2 (the camera adjusts the level depending on resolution and frame rate)

Bandwidth approximately 24MBit/s

4. 100 MBit/s switch

General setup:

I connect two cameras and my mainboard to the switch. To display both streams simultaneously I use GStreamer:

gst-launch-1.0 -v udpsrc port=5002 ! application/x-rtp,payload=96 ! rtph264depay ! queue ! h264parse ! imxvpudec ! videocrop top=0 left=0 bottom=0 right=0 ! imxipuvideosink sync=false window-x-coord=0 window-y-coord=0 window-width=400 window-height=480 force-aspect-ratio=false &

gst-launch-1.0 -v udpsrc port=5004 ! application/x-rtp,payload=96 ! rtph264depay ! queue ! h264parse ! imxvpudec ! videocrop top=0 left=0 bottom=0 right=0 ! imxipuvideosink sync=false window-x-coord=400 window-y-coord=0 window-width=400 window-height=480 force-aspect-ratio=false &

Setup 1: Two cameras from manufacturer A with resolution 1 (1280x960@30fps).

Result 1: Both streams were shown on my display without noticeable latency. Everything is fine.

Setup 2: Two cameras from manufacturer B with resolution 1 (1280x720@30fps).

Result 2: Both streams were shown on my display without noticeable latency. Everything is fine.

Setup 3: Two cameras from manufacturer A with resolution 2 (640x400@30fps).

Result 3: Both streams were shown on my display without noticeable latency. Everything is fine.

Setup 4: Two cameras from manufacturer B with resolution 2 (640x400@30fps).

Result 4: Both streams were shown on my display without noticeable latency. Everything is fine.

Conclusion:

Two cameras from the same manufacturer work well.

I also tried different resolutions with the same camera type. This also works fine.

Now it gets interesting.

Setup 5: One camera from manufacturer A and one camera from manufacturer B; A with resolution 1 (1280x960@30fps) and B with resolution 1 (1280x720@30fps).

Result 5: The camera from manufacturer B works well, but the camera from manufacturer A stalls. A new image is only shown occasionally, I think every 5 seconds.

Camera A behaves the same even when I don't display the streams simultaneously. I use GStreamer to show camera A:

gst-launch-1.0 -v udpsrc port=5002 ! application/x-rtp,payload=96 ! rtph264depay ! queue ! h264parse ! imxvpudec ! imxeglvivsink sync=false

It takes a few seconds until GStreamer brings up the pipeline. Then a new image is shown only every 5 seconds. If I disconnect camera B from the switch, camera A is immediately shown normally.
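
For reference, a variant of this pipeline with an explicit rtpjitterbuffer in front of the depayloader (an untested sketch; the caps are spelled out because rtpjitterbuffer needs the clock-rate) would be:

gst-launch-1.0 -v udpsrc port=5002 caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" ! rtpjitterbuffer latency=200 ! rtph264depay ! queue ! h264parse ! imxvpudec ! imxeglvivsink sync=false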

When I swap the roles (show camera B with GStreamer), camera A has no influence; camera B is shown without any limitation.

First I thought it had something to do with the resolution and the frame rate. The hardware H.264 decoder of the i.MX6 can decode up to 8 streams at D1@30fps, but only 1 stream at 1080p@30fps.

But I reduced the resolution to 640x400@30fps on both cameras and got the same behaviour. As soon as a second stream comes in that is not identical to the camera A stream, I only get new images from camera A every 5 seconds.
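
As a rough sanity check (assuming the usual 16x16 macroblocks and D1 = 720x480, and taking the 8 x D1@30fps figure as the reference), the decode load of two 640x400@30fps streams is far below the VPU limit:

# VPU reference load: 8 x D1@30fps
echo $(( 8 * (720/16) * (480/16) * 30 ))   # 324000 macroblocks/s
# My load: 2 x 640x400@30fps
echo $(( 2 * (640/16) * (400/16) * 30 ))   # 60000 macroblocks/s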

Camera B is unaffected and always works as I expect. I can connect two cameras from manufacturer B at 1920x1080@30fps and they are shown without latency or judder.

What is the reason for that behaviour?

- Is it the fact that camera A always sends baseline profile with level 4.1, so that the decoder allocates too many resources?

  But then why can I show two streams from camera A at 1280x960@30fps without problems?

- Do the different cameras use different codec data, and the decoder is not able to switch between them fast enough?

I rule out bandwidth problems because the total bandwidth is only about 50 Mbit/s.
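
To back this up with a measurement, the link could be loaded directly; a minimal check, assuming iperf3 is available on the board and on a PC behind the same switch (<board-ip> stands for the board's address):

# On the i.MX6 board:
iperf3 -s
# On the PC, push 50 Mbit/s of UDP towards the board for 30 seconds:
iperf3 -c <board-ip> -u -b 50M -t 30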

Thanks

Edit 16-07-2020:

I changed the GStreamer pipeline a bit to get the codec_data:

gst-launch-1.0 -v udpsrc port=5002 caps="application/x-rtp" ! rtph264depay ! video/x-h264 ! h264parse ! imxvpudec ! imxipuvideosink sync=false

codec_data of camera from manufacturer A:

/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)01420029ffe1003c67420029e3501406642000007d00001d4c0c1bbde000400000000000000000000000000000000000000000000000000000000000000000000000000001000c68ce3c800000000000000000, level=(string)4.1, profile=(string)baseline

codec_data of camera from manufacturer B:

/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)0142e016ffe100162742e0168d680a033a6c800005dc00015f90478a115001000528ce074920, level=(string)2.2, profile=(string)constrained-baseline

I decoded the codec_data and saw that the sequenceParameterSetNALUnit differs between the cameras. I don't know whether this is related to my issue.
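
For reference, in avcC-format codec_data the second byte is the profile_idc and the fourth byte the level_idc, so the values above can be checked straight from the hex with bash arithmetic:

# codec_data (avcC) byte layout: version, profile_idc, profile_compat, level_idc, ...
echo $((16#42))   # 66 -> baseline profile (both cameras)
echo $((16#29))   # 41 -> level 4.1 (camera A)
echo $((16#16))   # 22 -> level 2.2 (camera B)

The SPS length fields also differ noticeably: 0x003c (60 bytes, largely zero padding) for camera A versus 0x0016 (22 bytes) for camera B.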

Bio_TICFSL
NXP TechSupport

Hello Michael,

I understand it's a pipeline like:

camera(h264) -> UDP -> parser -> vpu decoder -> ipu resize -> display

 

As far as I know, the IPU can handle at most 2 streams. When you test the smaller resolution, do you use the resizer?

Regards

MicMoba
Contributor V

Hi Bio_TICFSL,

You are right about the pipeline. What do you mean by "do you use the resize"? If the IPU can only handle 2 streams, how is it possible to show 3 or 4 streams simultaneously?

Meanwhile I have done some more tests. For these I configured all cameras (I now have three of them) to 1280x800@30fps, H.264 over RTP.

I noticed that it is a combination of camera type and bitrate that causes the problems.

Furthermore, I created rtpdumps of all cameras. Every rtpdump is one minute long; they differ only in bitrate.

Cam A: 1280x800@30fps_40000kbps

Cam A: 1280x800@30fps_25000kbps

Cam A: 1280x800@30fps_10000kbps

Cam A: 1280x800@30fps_7500kbps

Cam A: 1280x800@30fps_5000kbps

The same for Cam B and C.

Now I can play back the rtpdumps with rtpplay from the rtp-tools:

./rtpplay -T -f ../../../CAM_A_1280x800_30fps_40000kbps.rtpdump 192.168.0.75/5002

But the behaviour with rtpplay differs from a real camera stream.

For example:

CAM_A_1280x800_30fps_40000kbps (camera stream)

CAM_C_1280x800_30fps_25000kbps (camera stream)

Result: The CAM_A stream was shown fine, but CAM_C had a frozen image.

Only when I reduced the bitrate of CAM_A to 7500kbps were both cameras shown well.

BUT when I stream CAM_C via rtpplay, both cameras are shown well:

CAM_A_1280x800_30fps_40000kbps (camera stream)

CAM_C_1280x800_30fps_25000kbps (rtpplay)

What can be the difference between the real camera stream and the rtpplay stream?
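
One way to look for the difference, assuming tcpdump is available on the board (or on a mirror port of the switch), is to capture both variants and compare the packet pacing in Wireshark:

# Capture the real camera stream, then the rtpplay replay, on the same port:
tcpdump -i eth0 -c 20000 -w cam_real.pcap udp port 5002
tcpdump -i eth0 -c 20000 -w cam_rtpplay.pcap udp port 5002

With -T, rtpplay paces packets by their RTP timestamps, while a camera may emit each encoded frame as one back-to-back burst of packets; the average bitrate is the same, but the instantaneous load on the receiver is not.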

I tested several combinations of camera types and bitrates and noticed that only one camera causes the problems. Every time I got judder, reducing the bitrate of this camera to 7500kbps or less made both streams show well.

And every time, this camera itself was shown without judder; only the other camera juddered or stopped.

Is there a secret prioritisation in UDP/RTP that I don't know about?

Here is my GStreamer pipeline for testing:

gst-launch-1.0 -v udpsrc port=5002 ! application/x-rtp,payload=96 ! rtph264depay ! queue ! h264parse ! imxvpudec ! imxipuvideosink sync=false window-x-coord=400 window-y-coord=0 window-width=400 window-height=480 force-aspect-ratio=false &

gst-launch-1.0 -v udpsrc port=10002 ! application/x-rtp,payload=96 ! rtph264depay ! queue ! h264parse ! imxvpudec ! imxipuvideosink sync=false window-x-coord=0 window-y-coord=0 window-width=400 window-height=480 force-aspect-ratio=false

Bio_TICFSL
NXP TechSupport

What is the kernel version you are using?

 

Can you display 4 "gst-launch-1.0 videotestsrc" pipelines on your screen?

 

Regards

MicMoba
Contributor V

I noticed another thing.

For this test I didn't show two or three streams simultaneously; I only showed one stream.

CAM_A 1280x800@30fps_25000kbps -> UDP port 8884

CAM_B 1280x800@30fps_40000kbps -> UDP port 10002

I started GStreamer:

gst-launch-1.0 -v udpsrc port=8884 ! application/x-rtp,payload=96 ! rtph264depay ! queue ! h264parse ! imxvpudec ! imxipuvideosink sync=false

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, payload=(int)96, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, parsed=(boolean)true
[INFO]  bitstreamMode 1, chromaInterleave 0, mapType 0, tiled2LinearEnable 0
/GstPipeline:pipeline0/GstImxVpuDecoder:imxvpudecoder0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, parsed=(boolean)true
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, payload=(int)96, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264

The serial output stops. No output on my screen.

After I decreased the bitrate of CAM_B to 7500kbps, the following output appeared on the serial line and the stream of CAM_A was shown on my screen.

/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, width=(int)1280, height=(int)800, framerate=(fraction)30/1, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, profile=(string)constrained-baseline, level=(string)5
[INFO]  bitstreamMode 1, chromaInterleave 0, mapType 0, tiled2LinearEnable 0
/GstPipeline:pipeline0/GstImxVpuDecoder:imxvpudecoder0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, width=(int)1280, height=(int)800, framerate=(fraction)30/1, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, profile=(string)constrained-baseline, level=(string)5
/GstPipeline:pipeline0/GstImxVpuDecoder:imxvpudecoder0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)1280, height=(int)800, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstImxIpuVideoSink:imxipuvideosink0.GstPad:sink: caps = video/x-raw, format=(string)I420, width=(int)1280, height=(int)800, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)30/1

My suspicion is that it may have something to do with the network stack rather than the VPU or IPU. Is this possible?
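
If it is the network stack, one check I can think of (assuming netstat from net-tools is in the rootfs) would be to look for UDP receive buffer overruns while the high-bitrate stream is running, and then retry with larger socket buffers:

# Watch for UDP "receive buffer errors" while the high-bitrate stream is active:
netstat -su | grep -i error
# Raise the kernel limit and ask udpsrc for a bigger kernel receive buffer (bytes):
sysctl -w net.core.rmem_max=2097152
gst-launch-1.0 -v udpsrc port=8884 buffer-size=2097152 ! application/x-rtp,payload=96 ! rtph264depay ! queue ! h264parse ! imxvpudec ! imxipuvideosink sync=false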

MicMoba
Contributor V

I am using Kernel 4.4.126.

And yes, it is possible to display 4 "gst-launch-1.0 videotestsrc" pipelines on my screen:

gst-launch-1.0 videotestsrc ! imxipuvideosink window-x-coord=0 window-y-coord=0 window-width=400 window-height=240 force-aspect-ratio=false &

gst-launch-1.0 videotestsrc ! imxipuvideosink window-x-coord=0 window-y-coord=240 window-width=400 window-height=240 force-aspect-ratio=false &

gst-launch-1.0 videotestsrc ! imxipuvideosink window-x-coord=400 window-y-coord=0 window-width=400 window-height=240 force-aspect-ratio=false &

gst-launch-1.0 videotestsrc ! imxipuvideosink window-x-coord=400 window-y-coord=240 window-width=400 window-height=240 force-aspect-ratio=false

Bio_TICFSL
NXP TechSupport

You are using a non-supported kernel; please update your Linux.

MicMoba
Contributor V

What do you mean by non-supported kernel? What is the currently supported version?
