Hi,
We are facing a problem with the quality of H.264 encoding using the i.MX6Q VPU. The video is not clear, and there are visible blurred lines along horizontal edges.
We used the following pipelines for H.264 RTP streaming.
1. On the i.MX6 board (GStreamer 0.10):
gst-launch tvsrc ! vpuenc codec=6 ! rtph264pay ! udpsink host=192.168.0.135 port=5000
2. On the PC:
gst-launch udpsrc port=5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0JAHqaAtD2QAA\\=\\=\\,aM48gAA\\=\", payload=(int)96, ssrc=(uint)1161490911, clock-base=(uint)2476996736, seqnum-base=(uint)7152" ! rtph264depay ! ffdec_h264 ! xvimagesink
Attached is a snapshot of the video saved using the pipeline:
gst-launch tvsrc ! vpuenc ! matroskamux ! filesink location=output.mkv sync=false
The video is of very low quality (you can see that the fan and cupboard borders are not sharp), and exactly the same phenomenon is observed during H.264 RTP streaming.
We are stuck here. Has anybody encountered this problem before? If so, how can it be solved? We need this solved urgently. Thank you.
What version of the BSP are you using? Which board are you using?
We are using the Yocto daisy branch. We have made a custom board based on the SABRE Lite, which uses the ADV7180 TV decoder as a parallel camera input.
In addition to jamesbone's questions, please let me know whether you have considered that the tvsrc output video frames are interlaced.
Yes, we assume the tvsrc output video is interlaced. We used V4L2 ioctls to capture data from the ADV7180 decoder (setting the format field to V4L2_FIELD_INTERLACED), and we could get a clear picture that way. The problem we are facing is with the quality of capturing and encoding through GStreamer. Maybe we are missing something in the GStreamer pipeline that indicates whether the input is interlaced or not.
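For reference, roughly the same check can be done from the command line with v4l2-ctl (from the v4l-utils package); the device node and format values below are assumptions for an ADV7180 setup and may need adjusting:

```shell
# Query the current capture format; the "Field" line shows whether the
# driver is delivering interlaced frames (assumes the decoder is /dev/video0).
v4l2-ctl -d /dev/video0 --get-fmt-video

# Explicitly request interlaced capture (UYVY at 720x480/NTSC is an
# assumption; adjust width/height/pixelformat to your input standard).
v4l2-ctl -d /dev/video0 --set-fmt-video=width=720,height=480,pixelformat=UYVY,field=interlaced
```

If the reported field is "Interlaced" but the pipeline never deinterlaces, the comb-like double lines on edges are expected.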
The picture you shared looks as if the same line is being displayed twice. That's why I asked about interlacing.
Regarding quality: there are several aspects related to quality. When you say quality, what exactly do you mean?
daisy -> there are several "daisy" branches. Freescale has released at least two releases based on daisy. However, if you are using community source code and you are still on daisy, I really recommend you upgrade to fido, because daisy is more than a year behind.
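If the source really is interlaced, one GStreamer-side experiment (a sketch only; the exact caps depend on what imxv4l2src actually reports on your board) is to mark the stream as interlaced and deinterlace it before display:

```shell
# GStreamer 1.0 sketch: force interlaced caps, then deinterlace before display.
gst-launch-1.0 imxv4l2src ! 'video/x-raw,interlace-mode=interleaved' ! deinterlace ! autovideosink
```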
I don't understand where interlacing comes into play in this GStreamer pipeline. The same pipeline has been used by a lot of developers. By quality I mean the lines that are visible in the image; the image is not sharp or close to the original. What we want is something closer to the original video that is displayed using the pipeline gst-launch tvsrc ! mfw_v4lsink.
Can you please generate a movie file with the same bad quality you've been getting with streaming? Something like:
imxv4l2src -> vpuenc -> muxer -> filesink
Can you please use GStreamer 1.0 to test it?
Once you have created the movie, please attach it here.
1.) For a USB webcam
I tried with a USB webcam and did not get any lines in the video.
The caps for the webcam were as follows: caps=video/x-raw-yuv, format=(fourcc)YUY2, width=(int)640, height=(int)480, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
The interlaced parameter is making the difference, as you said.
2.) For parallel interface with ADV7180 decoder
I compiled GStreamer 1.0 and tried out the pipeline gst-launch-1.0 imxv4l2src ! autovideosink.
The lines are more prominent than with the 0.10 pipeline gst-launch tvsrc ! mfw_v4lsink.
I will add the clips soon.
Is there any solution to this problem? I know I am doing something wrong with the pipeline, but I can't find out where.
How do I take care of the interlaced parameter in the pipeline?
I have never worked with the ADV7180 decoder in GStreamer, so I don't know offhand which parameters would need to change.
What BSP version are you using?
From the i.MX Linux® User's Guide, Rev. L3.14.28_1.0.0-ga:
7.3.8 Recording the TV-in source
The TV-in source plugin gets video frames from the TV decoder. It is based on the V4L2 capture interface. A command line example follows:
For GStreamer 1.x use:
gst-launch-1.0 imxv4l2src ! imxv4l2sink
For GStreamer 0.10 use:
gst-launch tvsrc ! imxv4l2sink
$GSTL tvsrc num-buffers=100 ! vpuenc ! matroskamux ! filesink location=./output.mkv sync=false
NOTE
The TV decoder is the ADV7180. It supports NTSC and PAL TV modes. The output video frame is interlaced, so the sink plugin needs to enable deinterlacing. The default value of the imxv4l2sink deinterlace property is True.
The problem is solved.
We used this pipeline for display:
gst-launch-1.0 imxv4l2src ! deinterlace mode=1 ! autovideosink
This pipeline to save to a file:
gst-launch-1.0 imxv4l2src num-buffers=100 ! queue ! deinterlace mode=1 ! imxvpuenc_h264 ! filesink location=out.mkv sync=false
And this to stream:
board : gst-launch-1.0 imxv4l2src ! queue ! deinterlace mode=1 ! imxvpuenc_h264 ! rtph264pay ! udpsink host=192.168.0.135 port=5000
PC: H264.sdp file.
I've attached the out.mkv and H264.sdp files. Now there are no lines and the picture is very clear.
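For readers without the attachment: the exact H264.sdp contents are not reproduced in the thread, but an SDP file for a stream like the one above would look roughly like this (the address, port, and payload type are assumptions and must match the board pipeline):

```
v=0
o=- 0 0 IN IP4 192.168.0.135
s=iMX6 H264 stream
c=IN IP4 192.168.0.135
m=video 5000 RTP/AVP 96
a=rtpmap:96 H264/90000
```

Such a file can typically be opened on the PC with, for example, ffplay -protocol_whitelist file,udp,rtp H264.sdp, or with a media player that understands SDP.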
Hi, would you please show me the command used on the PC side? Thanks!
Hi, we haven't used an interlaced camera with a PC, and I doubt that any of the available USB cameras give interlaced output, so these pipelines cannot be tested on a PC. If you want to use a USB webcam, you need this pipeline for display:
gst-launch v4l2src device=/dev/video0 ! autovideosink
Hi, thanks for your explanation. Actually, in my case the camera is embedded on the i.MX6 board, and I can use the command gst-launch-1.0 imxv4l2src device=/dev/video0 ! vpuenc ! rtph264pay ! udpsink host=192.168.8.101 port=5000 sync=false to build the pipeline. I want to use an Android mobile phone as the receiver to play the video live. However, after many attempts I still have not succeeded. I wonder if you have tried i.MX6-board-to-Android-phone video transmission? If so, would you please show me how to achieve it? Thanks very much!