Interfacing Depth Sensors with ROS on NXP i.MX6 boards


About this document

This document describes how to interface, install, do basic programming with, and test depth cameras on i.MX 6QDL based boards using the Robot Operating System (ROS). If you are not using ROS, you can also install the proper drivers and compile them on your Ubuntu system as explained in the following document:  https://community.freescale.com/docs/DOC-330278

1. Software & Hardware requirements

Supported NXP HW boards:

  • i.MX 6QuadPlus SABRE-SD Board and Platform
  • i.MX 6Quad SABRE-SD Board and Platform
  • i.MX 6DualLite SABRE-SD Board
  • i.MX 6Quad SABRE-AI Board
  • i.MX 6DualLite SABRE-AI Board

Depth sensors tested: Microsoft Kinect, ASUS Xtion.

Software: GCC, Ubuntu 14.04, OpenCV, OpenNI, Python, ROS (Indigo).

2. Installation on ROS

For the steps to install ROS on i.MX6 boards, please follow:

https://community.freescale.com/docs/DOC-3301478

Before you can use ROS, you will need to initialize rosdep. It enables you to easily install system dependencies for source you want to compile and is required to run some core components in ROS.

$ sudo rosdep init

$ rosdep update

There are many different libraries and tools in ROS, and not all of them compile fully on ARM. In this case ROS Base is already installed, but any other packages need to be installed individually. First install the following dependencies, which will take some time and space (~1.4 GB):

$ sudo apt-get install --no-install-recommends freeglut3-dev libfreenect-dev \
  libusb-1.0-0-dev libudev-dev doxygen graphviz openjdk-6-jdk \
  ros-indigo-camera-info-manager ros-indigo-dynamic-reconfigure \
  ros-indigo-image-transport ros-indigo-image-proc ros-indigo-depth-image-proc \
  ros-indigo-tf ros-indigo-openni-launch ros-indigo-freenect-* \
  ros-indigo-depthimage-to-laserscan ros-indigo-image-view \
  ros-indigo-openni2-camera ros-indigo-openni2-launch \
  ros-indigo-rqt-common-plugins ros-indigo-rqt-graph
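
To quickly confirm that the ROS packages are visible to your environment, you can look one of them up with rospack (assuming /opt/ros/indigo/setup.bash has already been sourced); freenect_launch here is just an example, any of the packages above works:

$ rospack find freenect_launch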

No additional installation is needed to run the Kinect with ROS; if you are using a Kinect you can skip ahead to section 3 (Testing the Installation). However, the packages used to run the PrimeSense / ASUS Xtion on the i.MX6 are not available over apt yet, so they need to be compiled from source. To use OpenNI2 with ROS, we only need the shared OpenNI2 libraries and the drivers.

Clone OpenNI2

$ git clone https://github.com/OpenNI/OpenNI2
$ cd OpenNI2

Edit ThirdParty/PSCommon/BuildSystem/Platform.Arm

$ nano ThirdParty/PSCommon/BuildSystem/Platform.Arm

and replace

CFLAGS += -march=armv7-a -mtune=cortex-a9 -mfpu=neon -mfloat-abi=softfp #-mcpu=cortex-a8

with

CFLAGS += -march=armv7-a -mtune=cortex-a9 -mfpu=neon -mfloat-abi=hard
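
If you prefer to apply this change non-interactively, the same edit can be done with sed; this is a sketch that assumes the file still contains exactly the line shown above:

$ sed -i 's/-mfloat-abi=softfp #-mcpu=cortex-a8/-mfloat-abi=hard/' ThirdParty/PSCommon/BuildSystem/Platform.Arm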

Add support for pthread library:

$ nano ThirdParty/PSCommon/BuildSystem/CommonCppMakefile

Go to around line 95 and add the lines marked with + below (the surrounding lines are shown for context):

OUTPUT_NAME = $(EXE_NAME)                                              

    # We want the executables to look for the .so's locally first:
    LDFLAGS += -Wl,-rpath ./
 
+   ifneq ("$(OSTYPE)","Darwin")
+       LDFLAGS += -lpthread
+   endif
 
    OUTPUT_COMMAND = $(CXX) -o $(OUTPUT_FILE) $(OBJ_FILES) $(LDFLAGS)
 endif

Save the file and exit
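
You can quickly verify that the new flag made it into the Makefile with grep (the reported line number may differ between OpenNI2 revisions):

$ grep -n "lpthread" ThirdParty/PSCommon/BuildSystem/CommonCppMakefile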

Then run make to compile the OpenNI2 drivers and libraries

$ PLATFORM=Arm make ALLOW_WARNINGS=1

Once the compilation is done, run the Linux install script:

$ cd Packaging/Linux
$ sudo ./install.sh

Copy libraries and includes to the system paths

$ cd ../../
$ sudo cp -r Include /usr/include/openni2
$ sudo cp -r Bin/Arm-Release/OpenNI2 /usr/lib/
$ sudo cp Bin/Arm-Release/libOpenNI2.* /usr/lib/
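
Since new shared libraries were just copied into /usr/lib, it is a good idea to refresh the dynamic linker cache so they are picked up at runtime:

$ sudo ldconfig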

Create a package config file

$ sudo nano /usr/lib/pkgconfig/libopenni2.pc

and fill it with this:

prefix=/usr
exec_prefix=${prefix}
libdir=${exec_prefix}/lib
includedir=${prefix}/include/openni2
 
Name: OpenNI2
Description: A general purpose driver for all OpenNI cameras.
Version: 2.2.0.0
Cflags: -I${includedir}
Libs: -L${libdir} -lOpenNI2 -L${libdir}/OpenNI2/Drivers -lDummyDevice -lOniFile -lPS1080

This will enable Ubuntu to find the location of the drivers, libraries and include files. To make sure the package is correctly found, run

$ pkg-config --modversion libopenni2

This should print the same version as defined in the file above (2.2.0.0). Now the Xtion is ready to be used. Plug it in (if it is already plugged in, unplug it and plug it back in first), then run the sample program:

$ ./Bin/Arm-Release/SimpleRead
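
If SimpleRead does not find the sensor, first check that the Xtion actually enumerates on the USB bus and that no errors show up in the kernel log; these are generic checks and the exact vendor string depends on your sensor:

$ lsusb
$ dmesg | tail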

Then create a catkin workspace as described here, and check out the following packages in the src folder of the catkin workspace:

$ cd ~/catkin_ws/src
$ git clone https://github.com/ros-drivers/openni2_camera
$ git clone https://github.com/ros-drivers/openni2_launch
$ git clone https://github.com/ros-drivers/rgbd_launch

Now the ROS packages checked out above can be compiled in the catkin workspace with catkin_make:

$ cd ~/catkin_ws
$ catkin_make

Once the packages are compiled, the Xtion is ready for use with ROS, as shown in the next section.
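
Remember to source the workspace setup file in every terminal where you want to use the newly built packages (a standard catkin step, assuming the workspace is located at ~/catkin_ws):

$ source ~/catkin_ws/devel/setup.bash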

3. Testing The Installation

  1. Kinect. Open at least 3 bash terminals:

Terminal 1: Run ROS

$ roscore

Terminal 2: launch the Freenect driver

$ roslaunch freenect_launch freenect.launch

Terminal 3: run the image viewer with one of the following:
$ rosrun image_view image_view image:=camera/rgb/image_color
or: 
$ rosrun image_view image_view image:=camera/rgb/image_rect_mono
or: 
$ rosrun image_view disparity_view image:=camera/depth/disparity

Each command opens a new window showing the RGB, rectified mono, or depth/disparity image from the Kinect.
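
The ros-indigo-depthimage-to-laserscan package installed earlier can also convert the depth image into a 2D LaserScan topic; a minimal sketch, assuming the Freenect driver publishes the depth image on /camera/depth/image_raw:

$ rosrun depthimage_to_laserscan depthimage_to_laserscan image:=/camera/depth/image_raw

The resulting scan is published on the scan topic and can be inspected with rostopic echo /scan.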


  2. Xtion. Open at least 3 bash terminals:

Terminal 1: Run ROS

$ roscore

Terminal 2: launch OpenNI2

$ roslaunch openni2_launch openni2.launch

Terminal 3: use an rqt or RViz session to visualize the sensor output, e.g.:

$ rqt

or

$ rosrun rqt_graph rqt_graph

In the “rqt” window select “Plugins” -> “Visualization” -> “Image View“
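
Alternatively, you can check from the command line which image topics the OpenNI2 driver is publishing and open one of them directly with image_view; the topic names below are the defaults for openni2.launch and may vary with your launch configuration:

$ rostopic list | grep camera
$ rosrun image_view image_view image:=/camera/rgb/image_raw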


     (Optional) Install PySide in case you get a Python error when running rqt_graph:

$ pip install PySide
$ cd ~/

Note: rqt and RViz demand a lot of work from the i.MX GPU, so general graphics performance will be affected. In this case it is suggested to run RViz in a remote (networked) ROS session.
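
A common pattern for such a remote session is to run roscore and the camera driver on the i.MX6 board and the visualization on a desktop PC. A sketch, assuming a hypothetical board IP address of 192.168.0.10 and both machines on the same network; on the desktop PC:

$ export ROS_MASTER_URI=http://192.168.0.10:11311
$ export ROS_IP=<desktop-ip>
$ rosrun rviz rviz

ROS_IP (or ROS_HOSTNAME) should also be set on the board so that topics are advertised with a reachable address.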

References:

- www.ros.org
- https://dobots.nl/2014/05/05/asus-xtion-using-openni2-and-ros-on-udoo/
