iMX6Q: Empty buffers capturing sensor data (Aptina MT9V126)

dawoo
Contributor I

Hi All,

I am currently trying to grab images from an Aptina MT9V126 sensor on an iMX6Q. The kernel is based on Freescale's Linux-3.0.35-fsl-4.1. The sensor driver is based on a driver provided by Aptina for the MT9V129 and has been adapted to our needs.

Sensor output format:

     640 x 480 frame size
     YCbCr, progressive
     30 Hz frame rate

After initialization the driver offers a sensor device at /dev/video0. The sensor is correctly configured via I²C: it streams with the correct line- and frame-sync rates (I can see a frame rate of 30 Hz on the oscilloscope), and the data pins also show a plausible signal.

I have a user space application that opens /dev/video0 and uses the V4L2 API to access the sensor data. Setting up the format and querying and queueing buffers works fine. After starting the capture, polling the sensor file descriptor regularly returns at a rate of 30 times per second (which matches the frame rate), and I can also dequeue and requeue the buffers.
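
Roughly, the capture loop in my application looks like this (a simplified sketch; process_frame is just an illustrative placeholder, and the format/buffer setup is omitted):

#include <stddef.h>
#include <string.h>
#include <poll.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

extern void process_frame(void *data, size_t len);   /* placeholder for my processing */

static void capture_loop(int video_fd)
{
        struct pollfd pfd = { .fd = video_fd, .events = POLLIN };

        for (;;) {
                /* poll() returns about 30 times per second, matching the frame rate */
                if (poll(&pfd, 1, 2000) <= 0)
                        break;

                struct v4l2_buffer buf;
                memset(&buf, 0, sizeof(buf));
                buf.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
                buf.memory = V4L2_MEMORY_USERPTR;   /* I queue user space pointers */

                if (ioctl(video_fd, VIDIOC_DQBUF, &buf) < 0)   /* dequeue a filled buffer */
                        break;

                /* at this point the buffer is all zeros or still holds my own pattern */
                process_frame((void *)buf.m.userptr, buf.bytesused);

                if (ioctl(video_fd, VIDIOC_QBUF, &buf) < 0)    /* hand the buffer back */
                        break;
        }
}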

My problem is: the buffers that come back from the V4L layer are not filled with sensor data. All bytes are zero, or they contain the data I wrote into the buffers before queueing them.

Any suggestions?

Thanks!

jamesbone
NXP TechSupport

When you use mxc_v4l2_capture.out to test the camera function, it sets the capture format in the sensor driver, and the sensor driver then configures the camera sensor over I2C to output the correct data. You can add debug messages in your camera sensor driver to see which format was set from the application layer, and you can also measure the pixel clock to check whether the camera sensor outputs it correctly.
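
For example, a print like this in the set-format path of your sensor driver would show what the application requested (just a sketch; the helper name and where you call it from depend on your driver):

/* Sketch: print the format requested by the application layer
 * (call this from wherever your driver handles the set-format request). */
#include <linux/kernel.h>
#include <linux/videodev2.h>

static void dump_requested_format(const struct v4l2_pix_format *pix)
{
        pr_info("sensor s_fmt: %ux%u, fourcc %c%c%c%c\n",
                pix->width, pix->height,
                 pix->pixelformat        & 0xff,
                (pix->pixelformat >> 8)  & 0xff,
                (pix->pixelformat >> 16) & 0xff,
                (pix->pixelformat >> 24) & 0xff);
}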

For example, if the application tells the camera sensor to output 640*480 UYVY data at 15 fps, then with an 8-bit data bus the pixel clock is 640*480*2*15. You can measure the pixel clock to make sure the sensor outputs the correct data.
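
Worked out, that is 640 * 480 * 2 * 15 = 9,216,000, so roughly a 9.2 MHz pixel clock (ignoring blanking intervals); for the 640x480 at 30 fps case described above it would be about 18.4 MHz.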

By the way, if you are reworking the SabreSD board to connect your sensor, there may be noise on the CSI data lines, which can cause wrong captures.

dawoo
Contributor I

Hi Guys,

thanks for the input. I managed to capture sensor data after all. The solution was quite simple:

In my user space application I tried to feed user pointers to the V4L implementation. I guess that since the memory I provided was not physically contiguous, the DMA was not able to transfer the data from the sensor into the memory I allocated in user space.
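
The alternative is to let the V4L2 driver allocate physically contiguous buffers and mmap() them into user space, roughly like this (a sketch with illustrative names; error handling and cleanup omitted):

/* Sketch: request driver-allocated buffers and map them into user space
 * instead of passing user pointers. */
#include <string.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <linux/videodev2.h>

#define NUM_BUFFERS 4

static void *frame_ptr[NUM_BUFFERS];

static int setup_mmap_buffers(int video_fd)
{
        struct v4l2_requestbuffers req;
        memset(&req, 0, sizeof(req));
        req.count  = NUM_BUFFERS;
        req.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        req.memory = V4L2_MEMORY_MMAP;          /* driver allocates DMA-able memory */

        if (ioctl(video_fd, VIDIOC_REQBUFS, &req) < 0)
                return -1;

        for (unsigned int i = 0; i < req.count; i++) {
                struct v4l2_buffer buf;
                memset(&buf, 0, sizeof(buf));
                buf.type   = V4L2_BUF_TYPE_VIDEO_CAPTURE;
                buf.memory = V4L2_MEMORY_MMAP;
                buf.index  = i;

                if (ioctl(video_fd, VIDIOC_QUERYBUF, &buf) < 0)
                        return -1;

                /* map the driver's buffer into user space */
                frame_ptr[i] = mmap(NULL, buf.length, PROT_READ | PROT_WRITE,
                                    MAP_SHARED, video_fd, buf.m.offset);
                if (frame_ptr[i] == MAP_FAILED)
                        return -1;

                if (ioctl(video_fd, VIDIOC_QBUF, &buf) < 0)   /* queue it for capture */
                        return -1;
        }
        return 0;
}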

I also had to change the struct variable "mxc_capture_inputs" inside mxc_v4l2_capture.c: for capture input 0, I had to change the "name" member from "CSI MEM" to "CSI IC MEM".

I have another question though. As I understand it, the IPU (IC) can convert, let's say, UYVY images to RGB24 during the capturing process, right? How can I trigger this conversion?

Thank you.

igorpadykov
NXP Employee

Hi Andre, have you checked the sensor with the csi_v4l2_capture test or the SDK?
