Adding a video sensor for CSI


Contributor II

I have been working on adding 2 sensors for video input using CSI0.

First is porting a driver for gs29xx for device tree, based on the gs2971 driver from 3.0.35...

Second is to implement a self-test driver to activate the CSI test pattern when included as a camera...

I think I found a bug/design issue in mxc_v4l2_capture.c

There are several places which make use of 'struct sensor_data' in this file.

This is the PRIVATE data for the v4l2 internal device.

It seems to me that there is no good reason for mxc_v4l2_capture to know about the privates of a v4l2_int_device, much less for it to require a specific structure for those privates...

Forcing the use of this structure seems to pretty much exclude the use of anything except an ov564x device.

In my situation, the existing members of struct sensor_data are irrelevant, and I need members that are relevant to neither mxc_v4l2_capture nor ov564x cameras.

Am I overlooking something simple?

It seems to me that the members of struct sensor_data used in the fragments below should, instead, be part of v4l2_int_device.

This suggested solution would move the private members io_init, mclk_source, and sensor_clk up to v4l2_int_device.

This SHOULD be fine, since the slave v4l2_int_device initializes its own v4l2_int_device structure in probe(...).

The uses of this are in:

Note that in each of the following, cam->sensor is a struct v4l2_int_device *sensor;


  sensor_data = cam->all_sensors[c->value]->priv;

  if (sensor_data->io_init)


  cam->mclk_source = sensor_data->mclk_source;

If a new sensor actually used its own private data structure, then the above would result in:

1) An uncontrolled call into an unknown data space

2) Assignment of an uncontrolled pointer to be used as a clock structure pointer...


  sensor = cam->sensor->priv;



   sensor = cam->sensor->priv;


If a new sensor actually used its own private data structure, then the above two fragments would result in:

1) A clock driver operation on an uncontrolled pointer into an unknown data space.

My issue here is not that v4l2_int_device is known here; in fact, it must be known here.

The issue is that v4l2_int_device.priv should not be defined here, but should be private to each potential sensor type.

Anything used here should be part of the v4l2_int_device structure which is known on both ends of the driver configuration.

7 Replies

Contributor II

The link you gave is for another 3.0.35 driver.  It's not too different from the hacky one that I took some guidance from.

It, however, did not follow all of the guidelines in the manual, and, like the one you linked to, it modified other driver code, rendering the drivers incompatible with the normal camera or capture drivers.

My goal was to create one for 3.10.17 which does not break the system for cameras - basically, to make the test pattern just another device. The problem with that goal is a number of bad assumptions made in mxc_v4l2_capture.

In the end, though, I think there is only one of the modifications I made which might break it for the use of a camera like the OV5642.  Someone with that device in their system would have to do that test since I don't have one.

I tried to make a clean Dizzy setup to share this and ran into some unexpected problems. Can anyone shed light on why Dizzy backed off the kernel version? In any case, the Daisy version of GStreamer simply does not work well for me, and I need the one from Dizzy - so I created a layer and verified its function with a clean build.

I will try to attach a tgz archive of that layer to this post to share it. There is an OpenOffice read-me document in the root folder, and also a document with plain-text notes of my clean-build test steps...

Contributor II

Hi Steve,

The attached file "meta-sqs-pixtest.tgz" did help me. I tested it quickly in about 2 hours and it went well.

How I will make it work in Android is another headache, but getting it up and running was made easy by your post.

BTW, thank you for this, and more thanks for taking the time to draft the document explaining the matter.

Really appreciate the efforts.




Contributor II

After a couple more days to reflect on what I reported and design my work-arounds, it is worth sharing the work-arounds I am using now, along with a more detailed suggested solution...

Beyond what was reported above, I wanted to be able to use the test video source on, say, IPU1/CSI1 while I might have a MIPI camera on IPU1/CSI0 or IPU1/CSI1, and our GS2960 on IPU0/CSI0.

What I further discovered is that the module mxc_v4l2_capture is still pretty much written around having one IPU.

For example, the master_attach(...) function checks (in the slave's privates) for the desired/expected/designed CSI, but not for the IPU.  If I have a MIPI camera which wants to attach to IPU1/CSI0, this software might happily try to connect it to IPU0/CSI0 if nothing is already there.

In the current implementation, all 4 potential mxc video input sources are called "Mxc Camera", with nothing to distinguish IPU and CSI...  I have implemented a work-around, which is detailed below.

I understand the desire not to muck too much with the v4l2 core.  This is code shared broadly within Linux and apparently does what is needed well enough.

Instead of changing v4l2_int_device to carry MXC-specific data, I think a better platform work-around might be to extend it, something like this:

Create an extended structure for v4l2 master and slave devices which attach through the mxc platform:

struct mxc_v4l2_master {
	struct v4l2_master v4lmaster;
	struct clock *my_master_clock;
	int my_master_member;
};

struct mxc_v4l2_slave {
	struct v4l2_slave v4lsensor;
	struct clock *my_sensor_clock;
	int my_sensor_member;
};

Again, this is only for devices attaching through the IPUx/CSIy inputs on an mxc platform.

Also, I suggest moving away from statically instantiated module data.  It's an old idea and fundamentally limited.

My driver data, except for the IOCTL table, is dynamically allocated:

struct my_v4l2_int_device {
	struct my_sensor_privates actually_private_data;
	struct mxc_v4l2_slave platform_slave_data;
	struct v4l2_int_device v4l2dev_data;
};

In this suggestion, anything used by mxc_v4l2_capture (etc.) that is in sensor_data would now be in struct mxc_v4l2_slave.

This is a structure that both the mxc platform v4l2 driver and each added sensor know about.

It does not break the basic v4l2 core code, but it allows the platform to extend the valuable shared information needed by capture driver and sensor driver alike, without wasting space on useless members, and it preserves driver-private data.

Separately, the work-around I am debugging now is in two parts:

The first confronts the complete ambiguity of which v4l2 master device I want to attach to, and this, at least, is mind-numbingly simple:

In init_camera_struct(...) I add the following:

*(cam->video_dev) = mxc_v4l_template;
/* SDA-CCC - Let's name this device a bit more specifically:
** .name = "Mxc Camera" becomes "Mxc[IPU]Camera[CSI]"
*/
cam->video_dev->name[10] = 0;            /* Ensure string terminator */
cam->video_dev->name[9] = '0' + csi_id;  /* CSI ID after 'Camera' */
cam->video_dev->name[3] = '0' + ipu_id;  /* IPU ID after 'Mxc' */

After the template is copied, I name the input specifically to its IPUi/CSIc port as MxciCamerac

Now, in my device's probe(...) function, I can populate the v4l2_int_slave.attach_to member with the actual name of the IPUx/CSIy input I have built.

The second works around the need for the (to me, mostly useless) struct sensor_data, which mxc_v4l2_capture (etc.) depends upon as my device's private data:

struct my_v4l2_int_device {
	struct sensor_data visible_sensor_privates;
	struct my_sensor_privates actually_private_driver_data;
	struct v4l2_int_slave v4l2_slave_data;
	struct v4l2_int_device v4l2_device_data;
};

The pointer to this structure is stored in the v4l2_int_device.priv member and naturally points to what mxc_v4l2_capture (etc.) seems to need.

I populate what I must, and what I can use of struct sensor_data and put the rest of what I need into struct my_sensor_privates.

Since one device is a platform device, and the other is an SPI device, the I2C stuff in struct sensor_data is totally useless.

My inputs don't have brightness or contrast controls, so most of that other stuff is also totally useless.

I put the other controls my device actually needs in struct my_sensor_privates.

I populate struct v4l2_int_slave to attach specifically, by name, to the input the device is connected to, without any change to the v4l2 core software.

I register struct v4l2_int_device as dynamically allocated instead of static data - so I can have more than one of a thing, if desired.

So, for example, I can replace any actual camera with the internal self-test generator of the CSI by just changing the device tree.

It is just making the best I can of a design which violates the concept of driver-private data, but it is well on its way to working.


NXP Employee

Hi Steve,

Did you get the application working?

Have you tried looking for a driver specifically for the GS2960 that could help for an easier integration?

Hope this will be useful for you.
Best regards!

Note: If this post answers your question, please click the Correct Answer button. Thank you!


Contributor II

Not quite working - and some odd and unexpected problems:

First, RE the gs2960 - the closest I could find is the gs2971 driver for 3.0.35; the chips are virtually identical, differing mainly in registers supporting audio.  I have created what should prove to be a general driver supporting all 8 variations of the same basic chip (gs2960/61/70/71/A), but I only have the gs2960 to test with.

The SPI interface clearly shows I am contacting and initializing the gs2960 correctly, and that it is locking onto my test video stream as well.  It should be providing parallel video data, vsync, hsync, and pclk, but I am not getting frames buffered.  I consider most of that to be an issue to work out with Semtech - unless there really is an issue or quirk with using CSI0, of course.

That is why I decided to try the test pattern generator, figuring that if the problem is a hardware issue then software can work from the CSI forward with the CSI video test pattern.

The test pattern does work on the SabreSD on IPU0 with either CSI0 or CSI1, but when I try to set up a test of IPU1, it seems that udev hangs the system - however...  even when I stub out udev, the driver hangs when it invokes the code to start up the test pattern generator.

But I don't know whether any of this has ever been tested by anyone using IPU1 with either CSI, so I can't say at this point what the problem is.  The design of mxc_v4l2_capture is, at best, weak on all the issues I raised above.

I do have a basically working driver for the CSI Test Pattern Generator, though.  That can help me move forward a little bit with software while still working on the gs29xx driver issue.

My client said I can share the test pattern driver and my related enhancements to mxc_v4l2_capture, and I will do so here shortly - I hope later this week - when I put together a package with documentation.


NXP Employee

Hi Steve,

As these sensors are not supported in the Freescale BSP, we also don't have any SW (driver or user-space app) to test the CSI test pattern. Maybe the thread below can help you. See the i.MX53 file attached to that thread:

Hope this will be useful for you.
Best regards!

Note: If this post answers your question, please click the Correct Answer button. Thank you!


Contributor II

Hi Carlos,

That driver isn't much help, but it isn't far from the one that gave me the idea.

I put my full reply back on the main branch of the thread - which seemed to make sense to me, since I am sharing a Yocto layer archive that contains everything needed to build the driver I do have working now.

I did have to make several changes in the mxc_v4l2_capture module.  Many of them were to support GStreamer ioctl calls.  Some were to support use of the second IPU, which is not strongly supported as released.  I think only one of the changes I made might potentially break it for use by a camera.  I don't have the equipment to test that, however.

I would be happy to work with someone who knows the freescale internals of this platform driver to come up with a solution which can be shared in the BSP and provide useful improvements to the related drivers.

The OpenOffice read-me I included with the shared archive has a picture showing the test pattern on one of my monitors, but some very simple things which I think should work are causing it to break.  I am going to do some tests today which I hope will help isolate where the break is coming from - or at least some places it is not coming from...
