USING MEMORY interface instead of CSI

VSPA
Contributor I

We have a YUV420 file whose frames need to be resized and encoded to MP4 format.

Currently, in the Linux V4L driver, the PRP module is attached to the CSI interface over a dedicated CSI bus.

As per reference manual section 41.1.2, the dedicated CSI bus can be detached and the PRP module can be attached to the AHB memory bus.

How can we attach the PRP module to the AHB memory bus? Has anybody tried this?

Any other suggestions to achieve YUV420 file resizing using the i.MX27 VPU are welcome.

 

 


akio
Contributor I

To feed YUV420 raw data into the PRP, I hacked this via the V4L2 PRP mmap interface. Between QBUF and DQBUF there is a chance to replace the camera data with your own YUV420 data; the encoder will then take this as its source and encode it, and finally the video bitstream comes out.

 

If you still have no idea how to do this, just trace the mxc vpu unit test code, in capture.c and loopback.c, and you will know what I'm talking about.

 

If you want to skip the memcpy, just use the mmap virtual address directly as the buffer you read the file into.
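
Below is a minimal user-space sketch of that buffer-replacement loop, assuming a /dev/video0 capture node, a 320x240 YUV420 frame, an "input.yuv" file name, and a placeholder where the VPU encode call would go; the real loop lives in capture.c / loopback.c of the mxc vpu unit test.

/* Minimal sketch of the QBUF/DQBUF buffer-replacement hack.
 * Device node, file name, frame size and the encode step are assumptions. */
#include <fcntl.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <unistd.h>
#include <linux/videodev2.h>

#define WIDTH      320
#define HEIGHT     240
#define FRAME_SIZE (WIDTH * HEIGHT * 3 / 2)      /* YUV420 planar */

int main(void)
{
    int cam = open("/dev/video0", O_RDWR);       /* PRP-backed capture node (assumed) */
    int yuv = open("input.yuv", O_RDONLY);       /* your own YUV420 file (assumed) */
    if (cam < 0 || yuv < 0)
        return 1;

    /* Request and map one capture buffer. */
    struct v4l2_requestbuffers req = {0};
    req.count  = 1;
    req.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    req.memory = V4L2_MEMORY_MMAP;
    ioctl(cam, VIDIOC_REQBUFS, &req);

    struct v4l2_buffer buf = {0};
    buf.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    buf.memory = V4L2_MEMORY_MMAP;
    buf.index  = 0;
    ioctl(cam, VIDIOC_QUERYBUF, &buf);

    void *map = mmap(NULL, buf.length, PROT_READ | PROT_WRITE,
                     MAP_SHARED, cam, buf.m.offset);

    ioctl(cam, VIDIOC_QBUF, &buf);
    enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    ioctl(cam, VIDIOC_STREAMON, &type);

    for (;;) {
        /* After DQBUF the buffer belongs to the application, so the camera
         * frame can be overwritten with a frame read from the file before
         * the encoder sees it. */
        if (ioctl(cam, VIDIOC_DQBUF, &buf) < 0)
            break;
        if (read(yuv, map, FRAME_SIZE) < FRAME_SIZE)
            break;                               /* end of input file */

        /* hand `map` (or buf.m.offset) to the VPU encoder here */

        ioctl(cam, VIDIOC_QBUF, &buf);
    }

    ioctl(cam, VIDIOC_STREAMOFF, &type);
    munmap(map, buf.length);
    close(yuv);
    close(cam);
    return 0;
}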

 

Hope this helps you.

 

Best regards,

Akio

 


VSPA
Contributor I
Thanks a lot for your reply.
Our requirement is as follows:

We have to downscale a YUV file with a resolution of 640x480 or more to 320x240, encode it to MP4, and send it over a wireless network.
As we understand it, the CSI is a dedicated bus to the PRP module. It is still not clear how exactly we can feed our own YUV420 data instead of camera data, because the CSI writes directly to the PRP and the RX FIFO/DMA is not under software control.
Also, if we feed data that way, how do we provide frame details such as Start of Frame, End of Frame, Change of Field, and FIFO full? There is a single interrupt source to the interrupt controller from the maskable sensor interrupt sources of the CSI; how can we simulate this for raw YUV420?
Also, with the CSI, channel 1 is used for the display output, but we have to use channel 2 as input to the MPEG-4 encoder.
Hence we were wondering whether we can use the AHB memory interface instead of the CSI, as per reference manual section 41.1.2. Is this possible? It is nowhere clear how we can connect the AHB memory interface to the PRP.

akio
Contributor I

What you want is a driver effort, not only a hardware design issue. As far as I know, what I use only replaces the raw data after the CMOS capture is done; there is no change to the hardware or to the drivers (CMOS, PRP, etc.). If you want to try this, I think you have to trace the CMOS sensor driver to see how the CSI interface and the CMOS sensor are linked, and the PRP driver for the link between the PRP and the CSI. Then, finally, make your own driver that uses the PRP directly: one input structure for raw data input, one output for the encoder, and one output to the viewfinder as a preview on the framebuffer, which is shown on the LCD screen. The quickest way is to try what I said: just replace the CMOS raw data with what you want, and see whether the outcome is good or not.

 

I think that for using the PRP directly, modifying the Linux driver is enough. This is my idea; I hope it helps you figure out how to do it.

 

Best Regards,

Akio


VSPA
Contributor I
AKIO> What you want is a driver effort, not only a hardware design issue.
VSPA> We don't think we need to do any HW design, as the manual mentions that either the CSI or AHB memory can be used (section 41.1.2). We need to know how we can use the AHB memory instead of the CSI.

AKIO> I think that for using the PRP directly, modifying the Linux driver is enough.
VSPA> Can you please let us know what exactly we need to modify in the Linux driver?

akio
Contributor I

I think you have to trace the CMOS sensor, the mx27 emma_lt, and the related stuff in the Linux kernel source tree under drivers/media/video/mxc (there are three directories), plus the related platform register settings in the Linux include headers. Freescale encapsulates the emma_lt's PRP function as a V4L2 interface, and the emma_lt's PP function is handled the same way. What you have to take care of is the emma_lt channel init, format setting, scaling setting, data input/output setting, etc. The next step is to modify the driver, or create a new one, to operate the emma_lt the way you want. Hope this helps you.
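
As a hedged illustration of the format/scaling part of that V4L2 wrapper, the sketch below simply requests a 320x240 YUV420 output from the capture node; the device path and the assumption that the mx27 capture driver programs the PRP resize engine from these values come from the discussion above, not from code I have verified.

/* Sketch: asking the PRP-backed V4L2 capture node for a scaled output.
 * The device node name and pixel format are assumptions. */
#include <fcntl.h>
#include <string.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int request_scaled_output(const char *dev, unsigned int w, unsigned int h)
{
    int fd = open(dev, O_RDWR);
    if (fd < 0)
        return -1;

    struct v4l2_format fmt;
    memset(&fmt, 0, sizeof(fmt));
    fmt.type                = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width       = w;                 /* e.g. 320 */
    fmt.fmt.pix.height      = h;                 /* e.g. 240 */
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUV420;
    fmt.fmt.pix.field       = V4L2_FIELD_NONE;

    /* The capture driver is expected to program the PRP from these values,
     * so the buffers handed back are already w x h. */
    if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) {
        close(fd);
        return -1;
    }
    return fd;
}

/* usage (assumed node): int fd = request_scaled_output("/dev/video0", 320, 240); */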

 

On the hardware side, since I am not a hardware guy, I cannot say whether a hardware modification is needed or not. By the way, I suggest you first try replacing the raw data from the CMOS capture in the mxc vpu unit test code; the test is very easy to do, and you can check the encoder output.

 

Best Regards,

Akio

 


VSPA
Contributor I

Using the mxc_vpu test application, we are able to replace the raw data and give it to the encoder. The problem is that we have to scale the YUV data to 320x240 before giving it. The test application delivers YUV data already scaled from the camera by setting the width and height, i.e. the YUV buffer has already been scaled by the PRP module by the time we query it from the test application.

Here we are using the mxc_v4l2 test application (the same one integrated with mxc_vpu) to capture data from the camera. With it we can get scaled data from the camera by setting the output width and height. But since the frame we get from the v4l2 application is an already scaled buffer obtained through ioctl settings, I am not able to find an easy way to replace this buffer with a YUV frame (from a 640x480 YUV file) and have it scaled to 320x240.

That is, the buffers we get from the v4l2 test application are already the scaled ones.

Could you please give me more ideas on where I can hack easily to get a solution?


akio
Contributor I

What I have in mind is a modification that exports the PRP memory buffer related information to user space, so the user can pass YUV data into the PRP and get the output to the encoder, or show it on the LCD. As I said, you have to hack the PRP module in drivers/media/video/mxc/capture and the related stuff.

 

Best Regards,

Akio

 


NK
Contributor I

Dear Akio,

 

I am also trying to resize a YUV file using the PRP module, but with no success so far.

 

Could you please give us a solution or some code that would help us achieve this quickly?

 

Thanks

Nimesh

 

 


akio
Contributor I

Please refer to the following kernel modules in the i.MX27 Linux kernel source:

drivers/media/video/mxc/capture/*.h

 

1. Analyze the Kconfig and Makefile to see which parts are really used on the mx27.

2. mx27_prphw.c, mx27_prpsw.c, mx27_v4l2_capture.c, mx27_csi.c, mx27_capture.c, and ov2640_camera.c show the data path linking the mx27 CSI, the mx27 PRP, and the emma_lt.

3. Trace the above files to see how the CMOS data is sent into the mx27's emma_lt PRP.

4. Make your own kernel module that exports the proper fields to user space for YUV data input into the emma_lt, then take the output data and send it into the VPU or wherever you want to process it further (a rough sketch of this step follows below).
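
As a very rough sketch of what step 4 could look like, the module below just allocates one DMA buffer and exposes it to user space through mmap; the device name, buffer size, and the hook that would point the PRP input channel at the buffer's physical address are assumptions, since that wiring depends on the mx27 PRP driver internals.

/* Rough sketch: allocate a DMA buffer and let user space mmap it, so an
 * application can drop YUV frames into it.  How the PRP is pointed at
 * buf_phys is left as a comment (driver-internal, not shown here). */
#include <linux/module.h>
#include <linux/miscdevice.h>
#include <linux/fs.h>
#include <linux/dma-mapping.h>
#include <linux/mm.h>

#define BUF_SIZE (640 * 480 * 3 / 2)   /* one YUV420 frame, assumption */

static void *buf_cpu;
static dma_addr_t buf_phys;

static int prpfeed_mmap(struct file *file, struct vm_area_struct *vma)
{
    unsigned long size = vma->vm_end - vma->vm_start;

    if (size > PAGE_ALIGN(BUF_SIZE))
        return -EINVAL;

    /* Map the DMA buffer into the caller's address space so it can
     * memcpy YUV frames straight into it. */
    return remap_pfn_range(vma, vma->vm_start,
                           buf_phys >> PAGE_SHIFT,
                           size, vma->vm_page_prot);
}

static const struct file_operations prpfeed_fops = {
    .owner = THIS_MODULE,
    .mmap  = prpfeed_mmap,
};

static struct miscdevice prpfeed_dev = {
    .minor = MISC_DYNAMIC_MINOR,
    .name  = "prpfeed",               /* hypothetical device name */
    .fops  = &prpfeed_fops,
};

static int __init prpfeed_init(void)
{
    /* NULL device was common on the 2.6-era i.MX kernels this thread targets. */
    buf_cpu = dma_alloc_coherent(NULL, PAGE_ALIGN(BUF_SIZE),
                                 &buf_phys, GFP_KERNEL);
    if (!buf_cpu)
        return -ENOMEM;

    /* Here the mx27 PRP driver (mx27_prphw.c / mx27_prpsw.c) would be told
     * to use buf_phys as its input frame address instead of the CSI RX FIFO;
     * that wiring is driver-internal and not shown. */

    return misc_register(&prpfeed_dev);
}

static void __exit prpfeed_exit(void)
{
    misc_deregister(&prpfeed_dev);
    dma_free_coherent(NULL, PAGE_ALIGN(BUF_SIZE), buf_cpu, buf_phys);
}

module_init(prpfeed_init);
module_exit(prpfeed_exit);
MODULE_LICENSE("GPL");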

 

Hope this helps you do what you want.

 

Best Regards,

Akio
