How to use GStreamer with OpenCV



24,093 Views
Contributor II

Hi all,

I'm working with the Sabre Lite board and the OV5640 camera module, both from Boundary Devices, and I need to process video using OpenCV.

But I was informed that this camera can't work with OpenCV because its driver works with a Freescale version of V4L2 and not with the standard one.

I found some discussions about using GStreamer with OpenCV (along with Qt and OpenGL) in other cases.

I would like to know if it's possible to use GStreamer's library to capture video from this camera and transfer the frames to OpenCV for video processing.

Thanks,

Bruno

12 Replies

120 Views
Contributor I

Hi Bruno,

Did you find a way to solve this problem?

I ran into the same problem recently in my project, and I'm considering using GStreamer to send an RTP stream and then using OpenCV to capture that stream. I'm still doing some research to find a way to do that.
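As a rough sketch of that RTP idea (a sketch only: the vpuenc encoder, payloader, and port settings below are my assumptions based on typical i.MX GStreamer 0.10 pipelines, not something tested on this board):

```
# Sender (board): camera -> H.264 encode -> RTP over UDP
gst-launch-0.10 mfw_v4lsrc ! vpuenc codec=avc ! rtph264pay ! udpsink host=<PC_IP> port=5000
```

On the receiving side, an OpenCV build with GStreamer support could then open a matching udpsrc/depayload pipeline ending in appsink.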

regards,

Zizhao


120 Views
NXP Employee

I made it work using V4L, and also provided a sample here:

http://imxcv.blogspot.com.br/2014/05/onboard-camera-v4l-wrapper-for-use-with.html

regards,

Andre

120 Views
Contributor II

Hi André,

Sorry for being late, but I was busy and I had to use a USB camera for my project.

I had seen this code on your blog and I got some errors when I tried to compile it.

Now I have time to read your code more carefully, and I will test it on the Sabre Lite.

Thank you for your work,

Bruno



120 Views
Contributor II

Hi Zizhao,

Sorry for being late. I found some topics on Stack Overflow when I was searching for a solution, but I didn't find this one.

I will test it too.

Thank you for the contribution.

Bruno


120 Views
NXP Employee

excellent =)


120 Views
NXP Employee

Hi Bruno,

Yes, you can do that. You can create a GStreamer pipeline and get the frame data from it; as long as you convert the data to a format OpenCV understands, it will work perfectly fine. Otherwise the onboard camera will not work, only USB cameras (with UVC-based drivers).
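To illustrate that conversion step, here is a minimal sketch in Python (the language and the helper name are my own; it assumes the pipeline delivers raw packed BGR data with no row padding, and a nested list stands in for the IplImage data pointer):

```python
# Sketch: how a flat GStreamer buffer maps onto an OpenCV-style image.
# Assumes raw packed BGR, 3 bytes per pixel, no row padding (stride = width * 3).

def buffer_to_rows(data, width, height, channels=3):
    """Split a flat byte buffer (as pulled from a GstBuffer) into rows of (B, G, R) pixels."""
    stride = width * channels
    assert len(data) == stride * height, "buffer size does not match the caps"
    return [
        [tuple(data[r * stride + c * channels : r * stride + (c + 1) * channels])
         for c in range(width)]
        for r in range(height)
    ]

# Tiny 2x2 test frame: four BGR pixels (blue, green, red, white).
raw = bytes([255, 0, 0,   0, 255, 0,
             0, 0, 255,   255, 255, 255])
rows = buffer_to_rows(raw, width=2, height=2)
print(rows[0][0])  # -> (255, 0, 0), the blue pixel
```

The same index arithmetic applies when copying a real GstBuffer's data into an IplImage: the image is just the flat buffer reinterpreted with the width, height, and channel count taken from the pipeline caps.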

regards,

Andre


120 Views
Contributor II

Hi André,

I think that is exactly the issue. Maybe the problem is the format conversion, or maybe I'm using the pipeline in the wrong way.

I created a pipeline with mfw_v4lsrc and fakesink. I used a callback function to get a buffer and transfer the data into OpenCV formats, but I got a null pointer for the IplImage. I had to solve other problems not related to this, so I didn't proceed with it.

But now I'm back and I will try to solve it.

Thanks for the reply,

Bruno


120 Views
NXP Employee

I will try writing code like that myself too. It may take some time, but I'll let you know.

regards,

Andre


120 Views
Specialist I

Bruno,

would this help ?

Manually adding or removing data from/to a pipeline

The appsrc element pushes buffers into a pipeline (in your case, it would push your captured and processed camera data), and then you can connect it to a sink element (mfw_*). I have not used that element before, but it's worth trying.
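Schematically, appsrc and appsink sit at opposite ends of a pipeline (element names from the GStreamer 0.10 / Freescale plugin set; treat these exact pipelines as untested assumptions):

```
appsrc (app pushes processed frames) ! mfw_v4lsink    # direction above: app -> display
mfw_v4lsrc ! appsink (app pulls raw frames)           # capture direction: camera -> app
```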

Leo


120 Views
Contributor II

Hi Leonardo,

It's the opposite: I'm trying to use a GStreamer pipeline to capture frames from the camera and create a buffer to transfer to my OpenCV application. This is because I'm using an OV5640 camera, and this camera doesn't work with the standard V4L2 library.

In the case you showed, I could use appsink to get a buffer (I found code that does it) and then transfer the data to OpenCV, but appsink doesn't work for me (and I saw some topics about similar problems with appsink).

So I think a simple way to get frames is through GStreamer (camera -> GStreamer -> OpenCV), but I'm still learning how to use GStreamer.

I found this application: https://github.com/andreluizeng/gesture-recognition/tree/master/src

It's similar to what I need to do, except that instead of getting data from a video file, I will get it from the camera. Please take a look at main.cpp and netplayer_gst.cpp; these two files contain all the GStreamer code. I'm working on this code now.

I intend to create a pipeline with mfw_v4lsrc and fakesink (I don't need to show the frames on a display, only process them with OpenCV for a test case, because this is for robot displacement control). I think I must use a pad between these plugins to get the buffer.
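As a sketch of that test pipeline (the ffmpegcolorspace step and the raw-RGB caps are my assumptions for GStreamer 0.10, untested on the Sabre Lite; fakesink's signal-handoffs property is the standard way to get a callback per buffer):

```
mfw_v4lsrc ! ffmpegcolorspace ! video/x-raw-rgb ! fakesink signal-handoffs=true
```

With signal-handoffs=true, fakesink emits a "handoff" signal for every buffer it receives, so a connected callback gets each GstBuffer without anything being rendered.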

As soon as I get some results, I will post the code here.

Thanks for the reply,

Bruno


120 Views
Contributor II

I found a way to transfer a frame from GStreamer to OpenCV (copying from a GstBuffer into an OpenCV image structure), but it uses appsink, which I don't have installed on my image, and I think it doesn't work with mfw_v4lsrc.

Reading the GStreamer manual, I found out about ghost pads, so I tried to create a bin with mfw_v4lsrc connected to a ghost pad so that I could get a buffer using the function in gstbufferstraw.h. But I got these errors:

MFW_GST_V4LSRC_PLUGIN 3.0.8 build on Oct  3 2013 21:00:55.

(camera:3606): GStreamer-CRITICAL **: gst_ghost_pad_new: assertion `GST_IS_PAD (target)' failed

(camera:3606): GStreamer-CRITICAL **: gst_element_add_pad: assertion `GST_IS_PAD (pad)' failed

(camera:3606): GStreamer-CRITICAL **: gst_pad_add_buffer_probe_full: assertion `GST_IS_PAD (pad)' failed

check_msg.c:79: No messaging setup

The source code is in camera.c file.

All I need is a way to get a buffer out of the pipeline.

Thanks,

Bruno
