How can I use GStreamer with OpenCV?


4,796 Views
yaşarcani_tişge
Contributor II

Hi,

I am using an i.MX6Q SABRE-SD development board. I want to use GStreamer with OpenCV to capture and process video frames. But my code does not work. How can I do it? My code:

#include "opencv2/objdetect/objdetect.hpp"
#include "opencv2/highgui/highgui.hpp"
#include "opencv2/imgproc/imgproc.hpp"
#include <iostream>
#include <sstream>
#include <stdlib.h>

using namespace std;
using namespace cv;

String window_name = "Face Detection";

void detectFaces(Mat frame, int frameno) {
    CascadeClassifier face_cascade;
    face_cascade.load("/usr/examples/facevide/haarcascade_frontalface_alt.xml");
    CascadeClassifier eye_cascade;
    eye_cascade.load("/usr/examples/facevide/haarcascade_eye.xml");

    std::vector<Rect> faces;
    face_cascade.detectMultiScale(frame, faces, 1.1, 2, 0 | CV_HAAR_SCALE_IMAGE,
                                  Size(30, 30));
    std::vector<Rect> eyes;
    eye_cascade.detectMultiScale(frame, eyes, 1.1, 2, 0 | CV_HAAR_SCALE_IMAGE,
                                 Size(100, 100));

    std::ostringstream name;
    name << "rotated_im_" << frameno << ".jpg";
    for (size_t i = 0; i < faces.size(); i++) {
        Point center(faces[i].x + faces[i].width / 2, faces[i].y + faces[i].height / 2);
        ellipse(frame, center, Size(faces[i].width / 2, faces[i].height / 2), 0, 0, 360,
                Scalar(255, 0, 255), 4, 8, 0);
        for (size_t j = 0; j < eyes.size(); j++) {
            Point eyes_center(eyes[j].x + eyes[j].width / 2, eyes[j].y + eyes[j].height / 2);
            int radius = cvRound((eyes[j].width + eyes[j].height) * 0.25);
            circle(frame, eyes_center, radius, Scalar(255, 0, 0), 4, 8, 0);
        }
    }
    imwrite(name.str(), frame); // Save the processed frame to disk
}

int main() {
    //VideoCapture cap("gst-launch-1.0 imxv4l2src ! autovideosink ! appsink");
    //VideoCapture cap("gst-launch-1.0 autovideosrc ! autovideosink");
    //VideoCapture cap(" mfw_v4lsrc ! ffmpegcolorspace ! video/x-raw-rgb ! appsink");
    //VideoCapture cap("v4l2src ! video/x-raw, framerate=30/1, width=640, height=480, format=RGB ! videoconvert ! appsink");
    //const char* pipe =  "rtspsrc location=\"rtsp://192.168.0.220:554/user=admin&password=admin&channel=1&
    VideoCapture cap("gst-launch mfw_v4lsrc num-buffers=1 ! jpegenc ! filesink location=sample.jpeg");

    Mat frame;
    int i = 0;
    while (cap.read(frame)) {
        detectFaces(frame, i);
        i++;
    }
    return 0;
}
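(Editor's note: a minimal sketch of how the capture string is usually formed, assuming a GStreamer-enabled OpenCV build and an i.MX `imxv4l2src` element at a hypothetical device path. The string handed to `cv::VideoCapture` must be a bare pipeline description: no `gst-launch` prefix, no display or file sink such as `autovideosink`/`filesink`, and it must terminate in an `appsink` element so OpenCV can pull the frames.)

```cpp
#include <string>

// Build a GStreamer pipeline description suitable for cv::VideoCapture.
// "imxv4l2src" and "/dev/video1" are assumptions for this board; the
// essential part is the trailing "appsink", which hands decoded frames
// to the application instead of rendering them.
std::string buildCapturePipeline(const std::string &device, int width, int height) {
    return "imxv4l2src device=" + device +
           " ! video/x-raw, width=" + std::to_string(width) +
           ", height=" + std::to_string(height) +
           " ! videoconvert ! video/x-raw, format=BGR ! appsink";
}

// Usage (untested on hardware):
//   cv::VideoCapture cap(buildCapturePipeline("/dev/video1", 640, 480));
//   // On OpenCV 3+ the backend can be forced with the cv::CAP_GSTREAMER hint.
```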

Thank you.

Yasarcan

7 Replies

2,551 Views
yaşarcani_tişge
Contributor II
 
I use a camera connected to the board. (Camera: MIPI OV5642)

2,551 Views
joanxie
NXP TechSupport

Did you use the OV5642 or the OV5640? The OV5640 is the MIPI sensor; please confirm this.


2,551 Views
yaşarcani_tişge
Contributor II

I tried them, but it does not work. I made some changes to my code; it is still not working. joanxie

    int i = 0;                       // moved out of the loop so it is not reset on every pass
    for (;;)
    {
        Mat frame;
        printf("for dongusu\n");     // "for loop"
        cap >> frame;
        if (frame.empty()) {
            printf("no frame\n");
            break;
        }                            // this closing brace was missing in the original
        printf("while dongusu\n");   // "while loop"
        deneme(frame, i);            // "deneme" = "test"
        i++;
    }
}

I tried more than one option, but none of them worked at all:

    //string gst_pipe = "videotestsrc ! videoconvert ! appsink";
    //VideoCapture cap(gst_pipe);
    //VideoCapture cap( "v4l2src ! filesink location=file.jpg" );
    //VideoCapture cap(" mfw_v4lsrc ! videoconvert ! appsink");
    VideoCapture cap("-v udpsrc port=5000 caps=application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)BGR, depth=(string)8, width=(string)640, height=(string)480 ! rtpvrawdepay ! videoconvert ! ximagesink");
     // VideoCapture cap( "VSALPHA=1 gst-launch-1.0 filesrc location=<filename> ! jpegdec ! imagefreeze ! autovideosink" );
    //VideoCapture cap(" mfw_v4lsrc ! mfw_ipucsc ! appsink");
    //VideoCapture cap(" mfw_v4lsrc num-buffers=1 !  jpegenc ! filesink location=sample.jpeg");
    //VideoCapture cap("autovideosrc ! autovideoconvert ! appsink");
    //VideoCapture cap( "autovideosrc ! video/x-raw,format=YUY2 ! videoconvert !appsink" );
    //VideoCapture cap( "v4l2src num-buffers=1 ! ffmpegcolorspace ! video/x-raw-yuv,width=640,height=480,framerate=8/1 ! jpegenc ! filesink location=test.jpg  " );
    //VideoCapture cap("mfw_v4lsrc ! ffmpegcolorspace ! video/x-raw-rgb ! appsink");
    //VideoCapture cap("v4l2src num-buffers=1 ! ffmpegcolorspace ! video/x-raw-yuv,width=640,height=480,framerate=8/1 ! jpegenc ! filesink location=test.jpg ");
    //VideoCapture cap("v4l2src ! jpegdec ! ffmpegcolorspace ! ximagesink");
    //VideoCapture cap("-v imxfslv4l2src device=/dev/video1 ! imxv4l2sink");
    //VideoCapture cap("imxv4l2src num-buffers=1 device=/dev/video1 ! jpegenc ! filesink location=picture.jpg");
    //VideoCapture cap(" imxv4l2src num-buffers=1 device=/dev/video1 ! video/x-raw, format=(string)YUY2, width=(int)2592, height=(int)1944, framerate=15/1 ! jpegenc ! filesink location=test.jpg" );
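(Editor's note: a hypothetical helper, not from the thread, to illustrate why most of the attempts above fail before any hardware issue is reached. Two problems are visible in the strings themselves: some still carry a shell prefix such as `gst-launch` or `-v`, and most end in a file or display sink (`filesink`, `ximagesink`, `autovideosink`) instead of the `appsink` that OpenCV needs in order to receive frames.)

```cpp
#include <string>

// Rough sanity check on a pipeline string before handing it to
// cv::VideoCapture: reject shell-command prefixes, and require an
// appsink so frames are delivered to the application rather than
// consumed by the pipeline's own sink.
bool looksUsableForVideoCapture(const std::string &pipeline) {
    if (pipeline.find("gst-launch") != std::string::npos) return false;
    if (pipeline.rfind("-v ", 0) == 0) return false; // leading "-v" flag
    return pipeline.find("appsink") != std::string::npos;
}
```

Applied to the list above, only the variants ending in `appsink` (and lacking a `gst-launch` prefix) have a chance of working; the `filesink`/`ximagesink` ones are complete pipelines that leave nothing for OpenCV to read.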


2,551 Views
yaşarcani_tişge
Contributor II

I tried VideoCapture cap("autovideosrc ! autovideoconvert ! appsink"); but it doesn't work. What else can I try? joanxie


2,551 Views
joanxie
NXP TechSupport

You can also try

VideoCapture cap("mfw_v4lsrc ! ffmpegcolorspace ! video/x-raw-rgb ! appsink")

for converting the YUV output to RGB format.

2,551 Views
joanxie
NXP TechSupport
Do you use a webcam? Try the command below:
  • gst-launch-1.0 -v udpsrc port=5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)BGR, depth=(string)8, width=(string)640, height=(string)480" ! rtpvrawdepay ! videoconvert ! ximagesink

2,551 Views
joanxie
NXP TechSupport

The camera output is YUV; you should convert the YUV before passing it to OpenCV. Try RGB format and try again.
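(Editor's note: a minimal sketch of the YUV-to-RGB step this reply refers to, assuming BT.601 full-range coefficients. In practice the conversion is done by `ffmpegcolorspace`/`videoconvert` inside the pipeline, or by `cv::cvtColor` in application code, not by hand; this only illustrates what that element computes per pixel.)

```cpp
#include <algorithm>

// One pixel of the YUV->RGB conversion a videoconvert-style element
// performs (BT.601 full-range coefficients, an assumption here).
struct Rgb { int r, g, b; };

static int clamp255(double v) {
    // round to nearest and clamp to the 0..255 byte range
    return static_cast<int>(std::min(255.0, std::max(0.0, v + 0.5)));
}

Rgb yuvToRgb(int y, int u, int v) {
    double c = y, d = u - 128.0, e = v - 128.0;
    return { clamp255(c + 1.402 * e),
             clamp255(c - 0.344136 * d - 0.714136 * e),
             clamp255(c + 1.772 * d) };
}
```

Doing this per pixel in C++ would be far too slow on the i.MX6, which is why the replies in this thread put the conversion inside the GStreamer pipeline instead.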