We use PCIe to connect the Intersil TW6865 chip for the surround view solution.
This is the PCIe connection to the i.MX6Q SabreSD board.
This is the block diagram of the connection:
This is the 4-camera surround view:
The code base is the L3.0.35_12.10.02 release. You can merge the patch file into the latest Freescale release.
Please check the attached file for the patch code.
Note: It is only a test version.
The latest code for the L3.0.35 BSP: L3.0.35_GA4.1.0 Patches.7z
The latest code for the L3.10.53 BSP: L3.10.53_TW686x_patch.7z
Patch for the L4.1.15 1.1.0 GA BSP: TW6865 driver for Linux L4.1.15_1.1.0-ga.7z
Hi, is your intended video device enumerated? i.e. what does v4l2-ctl -d /dev/videoX --all give you?
And try exactly this:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,width=720,height=480,format=UYVY ! imxipuvideotransform ! imxeglvivsink
Note: for me, PCI MSI had to be disabled in order to get video frames!
Thanks!
Now it's working with "gst-launch-1.0 v4l2src device=/dev/video2 ! imxv4l2sink overlay-width=512 overlay-height=300 overlay-top=300 overlay-left=0".
I don't know why imxv4l2src doesn't work.
"v4l2-ctl -d /dev/videox --all" outputs is here:
Driver Info (not using libv4l2):
Driver name : tw686x
Card type : tw6869
Bus info : PCI:0000:01:00.0
Driver version: 4.1.15
Capabilities : 0x85200001
Video Capture
Read/Write
Streaming
Extended Pix Format
Device Capabilities
Device Caps : 0x05200001
Video Capture
Read/Write
Streaming
Extended Pix Format
Priority: 2
Video input : 0 (Composite0: ok)
Video Standard = 0x0000b000
NTSC-M/M-JP/M-KR
Format Video Capture:
Width/Height : 360/240
Pixel Format : 'YUYV'
Field : Interlaced
Bytes per Line: 720
Size Image : 172800
Colorspace : Broadcast NTSC/PAL (SMPTE170M/ITU601)
Flags :
Streaming Parameters Video Capture:
Capabilities : timeperframe
Frames per second: 30.000 (30/1)
Read buffers : 3
User Controls
brightness (int) : min=-128 max=127 step=1 default=0 value=0 flags=slider
contrast (int) : min=0 max=255 step=1 default=100 value=100 flags=slider
saturation (int) : min=0 max=255 step=1 default=128 value=128 flags=slider
hue (int) : min=-128 max=127 step=1 default=0 value=0 flags=slider
If I use imxv4l2src, it will say "could not link imxv4l2src0 to imxv4l2sink0".
Hi, now can you get the video stream without any latency, as expected? Do you have a git link to the driver you use with the SabreSD? Or are you using a different board?
Anuradha
1. For now, I'm using only one camera; if I run two or more cameras, there is an error message:
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
../../../../gstreamer-1.8.1/libs/gst/base/gstbasesrc.c(2948): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming task paused, reason error (-5)
The commands are below:
gst-launch-1.0 v4l2src device=/dev/video0 ! imxv4l2sink overlay-width=512 overlay-height=300 overlay-top=0 overlay-left=0 &
gst-launch-1.0 v4l2src device=/dev/video1 ! imxv4l2sink overlay-width=512 overlay-height=300 overlay-top=0 overlay-left=512 &
For one camera, there is no latency.
2. I used the driver you provided in another post, linux-imx6/drivers/media/pci/tw686x at boundary-imx_4.1.15_2.0.0_ga · boundarydevices/linux-imx6 · GitHub. I also tried this one, GitHub - sasamy/tw6869: tw6869/65 media bridge driver, with the same result.
I'm using a custom board.
I remember I had an 'unexpected p-b buffer' issue while running the camera, but I was experimenting with this on one of the Boundary Devices boards. Which board and distro are you using? Can you also show the current memory map in the kernel log?
i.e.:
BR
Anuradha
My board is not a Boundary Devices board.
yocto distro: "4.1.15-2.0.1", "fsl-imx-x11"
memory map:
Memory: 698176K/1048576K available (8364K kernel code, 432K rwdata, 3000K rodata, 432K init, 450K bss, 22720K reserved, 327680K cma-reserved, 0K highmem)
[ 0.000000] Virtual kernel memory layout:
vector : 0xffff0000 - 0xffff1000 ( 4 kB)
fixmap : 0xffc00000 - 0xfff00000 (3072 kB)
vmalloc : 0xc0800000 - 0xff000000 (1000 MB)
lowmem : 0x80000000 - 0xc0000000 (1024 MB)
pkmap : 0x7fe00000 - 0x80000000 ( 2 MB)
modules : 0x7f000000 - 0x7fe00000 ( 14 MB)
.text : 0x80008000 - 0x80b215e0 (11366 kB)
.init : 0x80b22000 - 0x80b8e000 ( 432 kB)
.data : 0x80b8e000 - 0x80bfa160 ( 433 kB)
.bss : 0x80bfd000 - 0x80c6d8c0 ( 451 kB)
I use the official kernel with the Boundary kernel's tw686x driver, and I have also edited the quirks.c file.
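The post doesn't show what was changed in quirks.c. Purely as an illustration (the vendor/device IDs are the Techwell/Intersil ones, but the specific workaround below is an assumption, not the poster's actual patch), a device fixup added to drivers/pci/quirks.c can look like this:
--------------
/* Illustrative fragment for drivers/pci/quirks.c, not the actual change.
 * Clamps the PCIe maximum read request size for the TW6869 capture bridge. */
static void quirk_tw686x_readrq(struct pci_dev *dev)
{
	pcie_set_readrq(dev, 128);	/* limit MaxReadReq to 128 bytes */
	dev_info(&dev->dev, "TW686x: PCIe max read request size limited\n");
}
DECLARE_PCI_FIXUP_FINAL(0x1797, 0x6869, quirk_tw686x_readrq);
--------------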
Hi, have you got the issue resolved? I can't see any issue in your logs. I also have the imxv4l2src video issue. I cannot even run a single camera consistently; the system freezes at some point and I get the 'unexpected p-b buffer' issue from the driver. I used Ubuntu for this. If there are any updates, please share them with me!
I have not solved the imxv4l2src video issue.
The first thing is to make sure whether it is a kernel issue or a GStreamer issue. Can you try an NFS root? If so, it will be much more convenient to try this rootfs:
rootfs.tar.bz2 - Google Drive. Note it's for my Linux 3.0.35, but it's also working for my Linux 4.1.15 with the device tree.
And then test it with:
gst-launch-0.10 v4l2src device=/dev/video0 ! 'video/x-raw-yuv, format=(fourcc)UYVY' ! mfw_isink disp-width=512 disp-height=300 axis-left=0 axis-top=0
Btw, how can I add mfw_isink? I have no such element in gst-inspect!
The command above is only suitable for the gst-launch-0.10 in the rootfs I provided.
Hi Lei Ma,
Have you got the camera issue resolved with the Boundary Devices driver? I still have this latency and freezing issue. I am strictly following the datasheet schematic of the TW6865 chip!
Thanks
Sorry, I'm not familiar with Linux device drivers; for now I'm focused on image processing with OpenGL ES and OpenCV.
The Boundary Devices kernel has two drivers for the TW6865: one is tw6869 from GitHub - sasamy/tw6869: tw6869/65 media bridge driver, and the other is tw686x from the upstream kernel. I use the upstream one.
Hi, that's right. I am referring to the upstream kernel driver. Since you got it working without any problem, I was wondering whether you made a change to the hardware or the driver. In my case, with the upstream kernel driver, the video starts, the frame rate becomes very low, and then the system hangs. I also see the 'unexpected p-b buffer' error in dmesg. Did you have to disable audio support or anything like that? I had to disable the PCIe interrupt as well to get it working. If you don't mind, can you kindly share your kernel .config file? I want to see the kernel configuration under PCIe.
Thanks
Hi,
Here is my kernel config file: .config - Google Drive. I also disabled the PCIe interrupt.
Thanks very much Lei, this is very helpful. Btw, do you see the 'unexpected p-b buffer' error appear in dmesg when running the camera?
Anuradha
No, there is no such error on my device.
Hi, thanks a lot for the prompt response. That confirms one thing to me: there is a problem in the driver or in my hardware!
Anuradha
Hi, one last thing to confirm, Lei: are you aware of the following patch for DMA allocation?
https://git.congatec.com/arm/qmx6_kernel/commit/f4857b5482c67dcc9757d50cf57cb32728af788b?w=1
https://community.nxp.com/thread/304368
If you don't mind, can you kindly check or share the videobuf-dma-contig.c or mxc_vpu.c files of your running kernel? I saw in several posts that these changes are required. If you're not aware of them, I'd like to point this out for your attention too.
Thanks
I didn't know about that; my kernel version is 4.1.15, and these two files are a bit different there.
Hi Lei Ma,
I got to know from the original author of the driver that imxv4l2videosrc only works for CSI cameras, not with this PCI device. So the correct source is v4l2src!
Btw, is your single camera also working well with gst-launch-1.0 and imxv4l2sink? Without any latency? Can you also show me your pci/quirks.c file?
Kindly confirm it for me.
Anuradha
Sorry, I forgot about this. Does "latency" mean delay? If so, there is no latency with my camera.
Dear Liqiang,
We are using your patch (TW6865 driver for Linux L4.1.15_1.1.0-ga.7z) and trying to develop a 4-camera surround-view application based on the i.MX6D chip with PCIe / TW68x / CVBS cameras, but we have met a problem!
The application always freezes at the following code:
--------------
if (ioctl(m_ifd, IPU_CHECK_TASK, &m_sTask) != IPU_CHECK_OK) {
	printf("IPU_CHECK_TASK failed.\r\n");
	return -1;
}
--------------
Do you know the reason? Thanks a lot!
Our goal is zero-copy in the application, but from the symptoms it looks like the program freezes while doing VDI (de-interlacing).
Addendum:
The interface we use to allocate memory is glTexDirectVIV, instead of the g2d_alloc used in the example code;
is that the reason the IPU fails when doing VDI?
We have confirmed that VDI only works normally when the memory is allocated with g2d_alloc; if the memory is allocated with malloc, the program freezes!
Can we take this to mean that the patch (TW6865 driver for Linux L4.1.15_1.1.0-ga.7z) only supports VDI into g2d_alloc memory, and does not support VDI into malloc memory?
If we need to support malloc, how should we handle it? Thanks!
The IPU task's input.paddr and output.paddr should be the physical addresses of the input buffer and the output buffer; if you use a wrong address, it will fail to work.
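For illustration only (this is not Qiang Li's code): a minimal sketch of a VDI (de-interlace) task in which both paddr fields come from g2d_alloc buffers. The resolution, pixel formats, and the deinterlace_frame helper are assumptions, and error handling is minimal.
--------------
/* Sketch only: de-interlace one captured frame with the IPU, using buffers
 * obtained from g2d_alloc() so that both paddr values are physically
 * contiguous. Assumed: 720x480 UYVY input, RGB565 output. */
#include <stdio.h>
#include <string.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/ipu.h>   /* struct ipu_task, IPU_CHECK_TASK, IPU_QUEUE_TASK */
#include <g2d.h>         /* struct g2d_buf, g2d_alloc() */

int deinterlace_frame(struct g2d_buf *in, struct g2d_buf *out)
{
	struct ipu_task task;
	int fd = open("/dev/mxc_ipu", O_RDWR);

	if (fd < 0)
		return -1;

	memset(&task, 0, sizeof(task));
	task.input.width    = 720;
	task.input.height   = 480;
	task.input.format   = IPU_PIX_FMT_UYVY;
	task.input.deinterlace.enable = 1;
	task.input.deinterlace.motion = HIGH_MOTION;
	task.input.paddr    = in->buf_paddr;   /* physical address from g2d_alloc() */

	task.output.width   = 720;
	task.output.height  = 480;
	task.output.format  = IPU_PIX_FMT_RGB565;
	task.output.paddr   = out->buf_paddr;  /* destination must also be contiguous */

	if (ioctl(fd, IPU_CHECK_TASK, &task) != IPU_CHECK_OK) {
		printf("IPU_CHECK_TASK failed\n");
		close(fd);
		return -1;
	}
	if (ioctl(fd, IPU_QUEUE_TASK, &task) < 0) {
		printf("IPU_QUEUE_TASK failed\n");
		close(fd);
		return -1;
	}
	close(fd);
	return 0;
}
--------------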
Thank you for your answer!
1) VDI into g2d memory,
2) then map the memory via GL.
Hardware DMA operations require physically contiguous memory; memory allocated with malloc does not meet this requirement.
Hi Qiang Li:
I think you need to check the delay of each processing step; before the next step starts, maybe the previous step has not finished.
Hi Liqiang,
> Hardware DMA operations require physically contiguous memory; memory allocated with malloc does not meet this requirement.
Please confirm:
Where does my application use DMA in this case?
Do you mean that the memory allocated by g2d_alloc is not physically contiguous? How does g2d_alloc allocate memory?
Thanks!
g2d_alloc allocates physically contiguous memory, while malloc does not.
For camera capture, IPU task operations, and G2D operations, the source buffer and destination buffer must all be physically contiguous memory.
Understood, thanks!
One more question, by the way:
Must the memory mapped by glTexDirectVIVMap be physically contiguous? Thanks!
Also, is the memory allocated by glTexDirectVIV physically contiguous? Thanks!
glTexDirectVIVMap() can also give you physically contiguous memory; its last parameter, Physical, can be used as the destination buffer for IPU task and G2D operations.
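To illustrate that last point (a sketch under assumptions, not code from this thread): mapping an existing physically contiguous buffer, e.g. one from g2d_alloc, into a GL texture with glTexDirectVIVMap. The width/height/format and the helper name are assumptions.
--------------
/* Sketch only: bind a physically contiguous buffer to a GL texture via the
 * Vivante GL_VIV_direct_texture extension, so the same physical address can
 * also serve as the destination of an IPU task or G2D operation. */
#define GL_GLEXT_PROTOTYPES
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>
#include <g2d.h>

void bind_contiguous_buffer(GLuint tex, struct g2d_buf *buf, int width, int height)
{
	GLvoid *logical  = buf->buf_vaddr;          /* CPU-visible address        */
	GLuint  physical = (GLuint)buf->buf_paddr;  /* same paddr the IPU/G2D use */

	glBindTexture(GL_TEXTURE_2D, tex);
	/* Map the existing buffer instead of letting GL allocate texture memory. */
	glTexDirectVIVMap(GL_TEXTURE_2D, width, height, GL_VIV_UYVY,
			  &logical, &physical);
	/* Tell the GPU the buffer contents changed (e.g. after an IPU VDI pass). */
	glTexDirectInvalidateVIV(GL_TEXTURE_2D);
}
--------------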
Based on what you said, do the GC2000 and GC320 use the GPU in sequence? After the GC2000 has finished the on-screen rendering, can the GC320 then take GPU resources for the 4-way image composition? Or does the GC320 execute first and then the GC2000? For different data buffers, is it impossible for the GC2000 and GC320 modules to run at the same time?
GC2000 and GC320 can run at the same time.
Hi Liqiang,
I am using the i.MX6Q platform. I now have four video channels and need to save each channel's video data separately. How do the four video streams get into the VPU for encoding? Single-channel encoding is fairly easy to implement,
but having the four channels pass through the VPU serially is logic I have no idea how to implement. Could you provide a link to similar material or a demo?
Thanks
4-channel encoding is the same as 1-channel encoding: just open 4 tasks. However, the VPU hardware cannot support 4-channel 720p encoding.
At what resolution can the VPU encode 4 channels simultaneously? Won't there be conflicts when 4 tasks use the VPU for encoding at the same time? They will all compete for VPU resources, right? When I use the IPU I can create 4 tasks (2 IPUs, with 2 ipu_tasks per IPU core), but does the VPU have a mechanism similar to the IPU's? I'm not very clear about the VPU's mechanism.
The VPU software architecture supports multiple tasks; of course, the actual encoding on the VPU hardware is still done one task at a time, in a queue.
I use GStreamer to encode 4 video channels to H.264 with this pipeline: srcapp->imxvideoconvert_ipu->vpuenc_h264->filesink. Can I do 4-channel encoding simply by creating 4 pipelines? Will the system automatically handle the encoding order of each channel during VPU encoding?
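No answer to that last question appears in the thread. As a sketch only, here is what four independent encode pipelines could look like from C with gst_parse_launch; v4l2src stands in for the application-specific srcapp element, and the device nodes and output file names are assumptions.
--------------
/* Sketch only: start four independent capture -> IPU convert -> VPU H.264 ->
 * file pipelines. Per the reply above, the VPU software stack accepts multiple
 * tasks and queues the actual hardware encodes, so the four pipelines simply
 * run in parallel from the application's point of view. */
#include <gst/gst.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
	GstElement *pipe[4];
	char desc[256];
	int i;

	gst_init(&argc, &argv);

	for (i = 0; i < 4; i++) {
		/* /dev/video0..3 and out0..3.mp4 are assumptions for illustration. */
		snprintf(desc, sizeof(desc),
			 "v4l2src device=/dev/video%d ! imxvideoconvert_ipu ! "
			 "vpuenc_h264 ! h264parse ! mp4mux ! filesink location=out%d.mp4",
			 i, i);
		pipe[i] = gst_parse_launch(desc, NULL);
		gst_element_set_state(pipe[i], GST_STATE_PLAYING);
	}

	/* Run until interrupted; a real application would also watch each bus. */
	g_main_loop_run(g_main_loop_new(NULL, FALSE));
	return 0;
}
--------------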