i.MX Processors Knowledge Base

This document provides an overall guide on how to get started with i.MX6 development. There are several chapters: 1. how to get the necessary docs from the Freescale website; 2. how to set up the environment and build your own images; 3. hardware design considerations; 4. how to get help. I hope this doc will bring you into the i.MX world more easily, and I hope you all have fun with it.
View full article
This is the procedure and patch to set up an Ubuntu 13.10 64-bit Linux host PC for building i.MX6x L3.0.35_4.1.0. It has been tested to build the GNOME profile with the FSL Standard MM Codec for the i.MX6Q SDB board.

A) Basic requirements:
Set up the Linux host PC using ubuntu-13.10-desktop-amd64.iso.
Make sure any previous LTIB installation and /opt/freescale have been removed.

B) Install the needed packages on the Linux host PC:
$ sudo apt-get update
$ sudo apt-get install gettext libgtk2.0-dev rpm bison m4 libfreetype6-dev
$ sudo apt-get install libdbus-glib-1-dev liborbit2-dev intltool
$ sudo apt-get install ccache ncurses-dev zlib1g zlib1g-dev gcc g++ libtool
$ sudo apt-get install uuid-dev liblzo2-dev
$ sudo apt-get install tcl dpkg
$ sudo apt-get install asciidoc texlive-latex-base dblatex xutils-dev
$ sudo apt-get install texlive texinfo
$ sudo apt-get install lib32z1 lib32ncurses5 lib32bz2-1.0
$ sudo apt-get install libc6-dev-i386
$ sudo apt-get install u-boot-tools
$ sudo apt-get install scrollkeeper
$ sudo apt-get install gparted
$ sudo apt-get install nfs-common nfs-kernel-server
$ sudo apt-get install git-core git-doc git-email git-gui gitk
$ sudo apt-get install meld atftpd
$ sudo ln -s /usr/lib/x86_64-linux-gnu/librt.so /usr/lib/librt.so

C) Unpack and install the LTIB source package (assumed to be done in the home directory):
$ cd ~
$ tar -zxvf L3.0.35_4.1.0_130816_source.tar.gz
$ ./L3.0.35_4.1.0_130816_source/install
After that, you will find the ~/ltib directory created.

D) Apply the patch so that L3.0.35_4.1.0 can be installed and compiled on Ubuntu 13.10 64-bit:
$ cd ~/ltib
$ git apply 0001_make_L3.0.35_4.1.0_compile_on_Ubuntu_13.10_64bit_OS.patch
What the patch does:
a) Modifies the following files:
   dist/lfs-5.1/base_libs/base_libs.spec
   dist/lfs-5.1/m4/m4.spec
   dist/lfs-5.1/ncurses/ncurses.spec
b) Adds the following files to the pkgs directory:
   pkgs/m4-1.4.16-1383761043.patch
   pkgs/m4-1.4.16-1383761043.patch.md5

E) Then it is ready to proceed with the rest of the LTIB environment setup process:
$ cd ~/ltib
$ ./ltib -m config
$ ./ltib

Reference:
L3.0.35_4.1.0_130816_docs/doc/mx6/Setting_Up_LTIB_host.pdf
https://community.freescale.com/message/332385#332385
https://community.freescale.com/thread/271675
https://community.freescale.com/message/360556#360556
m4 compilation issue:
1. https://github.com/hashdist/hashstack/commit/f6be2a58de62327d05e052d89c9aa931d4c926b3
2. https://github.com/hashdist/hashstack/issues/81
rpm-fs package failed-to-build issue:
https://community.freescale.com/message/355771#355771
scrollkeeper is needed for the gnome-desktop compilation.
NOTE: When compiling gstreamer, a warning may pop up; it appears safe to just ignore it.
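Before applying the patch in step D, it can be checked against the tree first. A minimal sketch, assuming the patch file has been copied into ~/ltib:
$ cd ~/ltib
$ git apply --stat 0001_make_L3.0.35_4.1.0_compile_on_Ubuntu_13.10_64bit_OS.patch    # list the files it touches
$ git apply --check 0001_make_L3.0.35_4.1.0_compile_on_Ubuntu_13.10_64bit_OS.patch   # dry run, reports any conflicts
$ git apply 0001_make_L3.0.35_4.1.0_compile_on_Ubuntu_13.10_64bit_OS.patch           # apply for real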
View full article
[Chinese translated version] See the attachment. Original post: https://community.nxp.com/docs/DOC-343715
View full article
The original implementation is from Frias Renato for the Sabreauto board.

How do we define the booting time? The booting time defined here is from the moment the board is powered up until the main application is working and shown directly to the end user. For example, for a board whose purpose is media playback, the booting time counts up to the first video frame shown on the screen.

To minimize the booting time, the following methods were tried: optimizing for performance, removing unnecessary modules at boot time, and starting the main application first after the kernel has booted.

Optimizing for performance:
U-Boot:
  1: Enable MMU and L2 cache.
  2: Optimize memset and memcpy.
  3: Implement SDMA to accelerate copying data from NOR flash to memory.
  4: Implement the uSDHC's ADMA to improve SD card read performance.
Kernel:
  1: Optimize the _memcpy_fromio function at arch/arm/kernel/io.c.

Remove unnecessary modules:
U-Boot:
  1: Disable UART output during the U-Boot stage and add the quiet parameter to the kernel boot arguments.
  2: Remove the boot-up delay in U-Boot.
  3: Disable I2C, SPI and SPLASH_SCREEN in U-Boot.
Kernel: The module removals below are only for the Sabresd board booting from SD card with a MIPI camera overlay on an LVDS screen; for other boards and applications, please do not use them directly.
  1: Modify arch/arm/mach-mx6/board-mx6q_sabresd.c to keep only the necessary module initialization in mx6_sabresd_board_init: iomux configuration, uart0, voda, ipu, ldb, v4l2, mipi-csi, i2c1, uSDHC1, pwm0, dvfs, mipi camera clock.
  2: Update the Linux kernel configuration file. Try to keep only the necessary modules and configuration to minimize size. Build the necessary modules externally to the kernel itself. Create the uImage from Image instead of zImage to remove the kernel self-extraction time (see the mkimage sketch after the boot log below). Use the ext4 file system on the SD card to accelerate rootfs mounting.
  Notice: the kernel configuration removes NETWORK support, which includes the Unix Domain Sockets needed by the udev mechanism, so this kernel configuration cannot support dynamic /dev/ nodes via udev and all /dev/ nodes must be created in the rootfs before boot.

Start the main application first after the kernel boots up. In the normal boot procedure, the init process runs the sysinit script first; this script does initialization and preparation for most user processes, but it normally takes about 1~5 seconds to execute. So the main application is now started before sysinit, and the necessary preparation for the main application is handled internally by the application itself. See the example below for a MIPI camera overlay on an LVDS screen:
/etc/inittab
::once:/unit_tests/mxc_v4l2_overlay.out -iw 640 -ih 480 -it 0 -il 0 -ow 1024 -oh 768 -ot 0 -ol 0 -r 0 -t -1 -d 0 -fg -fr 30
::once:/etc/rc.d/rcS
::once:/sbin/getty -L ttymxc0 115200 vt100 # GENERIC_SERIAL

Test result of fast boot on the Sabresd board with a MIPI camera overlay on an LVDS screen: the main application is executed about 958 ms after the board is powered up.
Running Bootloader
[0.356417 0.356417] [ 0.046637] _regulator_get: get() with no identifier
[0.958425 0.602008] starting pid 21, tty '': '/unit_tests/mxc_v4l2_overlay.out -iw 640 -ih 480 -it 0 -il 0 -ow 1024 -oh 768 -ot 0 -ol 0 -r 0 -t -^@
[0.969609 0.011184] starting pid 22, tty '': '/etc/rc.d/rcS'
[0.973368 0.003759] g_display_width = 1024, g_display_height = 768
[0.977540 0.004172] g_display_top = 0, g_display_left = 0
[0.980927 0.003387] starting pid 23, tty '': '/sbin/getty -L ttymxc0 115200 vt100 '
[1.048454 0.067527] Mounting /proc and /sys
[1.089526 0.041072] Setting the hostname to freescale
[1.116635 0.027109] Mounting filesystems
[1.527320 0.410685] sensor chip is ov5640_mipi_camera
[1.530627 0.003307] sensor frame size is 640x480
[1.533482 0.002855] sensor frame format is UYVY
[1.640221 0.106739] frame_rate is 30
[1.642249 0.002028]
[1.642270 0.000021] frame buffer width 0, height 0, bytesperline 0
[1.989728 0.347458]
[1.990761 0.001033] arm-none-linux-gnueabi-gcc (Freescale MAD -- Linaro 2011.07 -- Built at 2011/08/10 09:20) 4.6.2 20110630 (prerelease)
[2.001161 0.010400] root filesystem built on Tue, 11 Sep 2012 11:43:24 +0800
[2.006249 0.005088] Freescale Semiconductor, Inc.
[2.009394 0.003145]
[2.009531 0.000137] freescale login:

Please see the fast boot video below. I have also attached sample code for U-Boot and the kernel for your reference. The patch code is based on L3.0.35_12.09.01_GA.
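For reference, a minimal sketch of wrapping the uncompressed Image as a uImage with the mkimage tool from u-boot-tools; the load/entry address below is an assumption for a typical i.MX6 memory map (DDR at 0x10000000) and must match your board configuration:
$ # Build the uncompressed kernel image, then wrap it as a uImage (no self-extraction step)
$ make Image
$ mkimage -A arm -O linux -T kernel -C none \
    -a 0x10008000 -e 0x10008000 \
    -n "Linux fast-boot" -d arch/arm/boot/Image arch/arm/boot/uImage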
View full article
The i.MX 6D/Q L3.0.35_1.0.3 patch release is now available on www.freescale.com

Files available:
1. L3.0.35_1.0.3_TEMP_PFD_PATCH: This patch release is based on the i.MX 6Dual/6Quad Linux 12.09.01 release. The purpose of this patch release is to update the thermal sensor calibration routine and correct the PFD workflow in U-Boot. More details are in the release notes.
View full article
gst-launch is the tool to execute GStreamer pipelines.

Task: Looking at caps
Pipeline: gst-launch -v <gst elements>

Task: Enable log
Pipeline: gst-launch --gst-debug=<element>:<level>
Example: gst-launch --gst-debug=videotestsrc:5 videotestsrc ! filesink location=/dev/null
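Several debug categories can be combined in a single run, together with -v to watch caps negotiation. A minimal sketch (the element names are placeholders for whatever your pipeline uses):
gst-launch -v --gst-debug=videotestsrc:5,filesink:3 videotestsrc num-buffers=100 ! filesink location=/dev/null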
View full article
1. To set up the Yocto environment, from the BASE folder run:
fsl-community-bsp $ . setup-environment build

2. Build the toolchain:
build $ bitbake meta-toolchain
# Other toolchains:
# Qt Embedded toolchain build: bitbake meta-toolchain-qte
# Qt X11 toolchain build: bitbake meta-toolchain-qt

3. Install it on your PC:
build $ sudo sh \
  tmp/deploy/sdk/poky-eglibc-x86_64-arm-toolchain-<version>.sh

4. Set up the toolchain environment:
build $ source \
  /opt/poky/<version>/environment-setup-armv7a-vfp-neon-poky-linux-gnueabi

5. Get the Linux kernel's source code:
$ git clone git://git.freescale.com/imx/linux-2.6-imx.git linux-imx
$ cd linux-imx

6. Create a local branch:
linux-imx $ BRANCH=imx_3.0.35_4.0.0   # Change to any branch you want; use 'git branch -a' to list all
linux-imx $ git checkout -b ${BRANCH} origin/${BRANCH}

7. Export ARCH and CROSS_COMPILE:
linux-imx $ export ARCH=arm
linux-imx $ export CROSS_COMPILE=arm-poky-linux-gnueabi-
linux-imx $ unset LDFLAGS

8. Choose the configuration and compile:
linux-imx $ make imx6_defconfig
linux-imx $ make uImage

9. To test your changes, copy the uImage onto your SD card:
linux-imx $ sudo cp arch/arm/boot/uImage /media/boot

10. In case you want your changes to be reflected in your Yocto framework, create the patches following the document "i.MX Yocto Project: How can I patch the kernel?" (a minimal git sketch is shown below).
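As a reference for step 10, kernel changes are usually captured as patch files with git before being added to a Yocto recipe. A minimal sketch, where the file path and commit message are only examples:
linux-imx $ git add drivers/video/your_change.c        # example path, replace with your modified files
linux-imx $ git commit -m "Describe your kernel change"
linux-imx $ git format-patch -1                        # writes 0001-Describe-your-kernel-change.patch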
View full article
Check the new updated version with Morty here.

Step 1: Get the i.MX Yocto AVS setup environment
Review the steps under Chapter 3 of the i.MX_Yocto_Project_User's_Guide.pdf in the L4.X LINUX_DOCS to prepare your host machine, including at least the following essential Yocto packages:
$ sudo apt-get install gawk wget git-core diffstat unzip texinfo \
  gcc-multilib build-essential chrpath socat libsdl1.2-dev u-boot-tools

Install the i.MX NXP AVS repo:
Create/move to a directory where you want to install the AVS Yocto build environment; let's call it <yocto_dir>.
$ cd <yocto_dir>
$ repo init -u https://source.codeaurora.org/external/imxsupport/meta-avs-demos -b master -m imx7d-pico-avs-sdk_4.1.15-1.0.0.xml

Download the AVS BSP build environment:
$ repo sync

Step 2: Set up Yocto for the Alexa SDK image with the AVS-SETUP-DEMO script
Run the avs-setup-demo script as follows to set up your environment for the imx7d-pico board:
$ MACHINE=imx7d-pico DISTRO=fsl-imx-x11 source avs-setup-demo.sh -b <build_sdk>
where <build_sdk> is the name you will give to your build folder.
After accepting the EULA, the script will prompt you to enable the following options.

Sound card selection
The following sound cards are supported by the build:
- SGTL (on-board audio codec for PicoPi)
- 2-Mic Conexant
The script will ask whether you are going to use the Conexant card; if not, SGTL is assumed as your selection.
Are you going to use Conexant Sound Card [Y/N]?

Install Alexa SDK
The next option is to select whether you want to pre-install the AVS SDK software on the image.
Do you want to build/include the AVS_SDK package on this image(Y/N)?
If you select YES, then your image will contain the AVS SDK ready to use (after authentication). Note that this AVS_SDK will not have WakeWord detection support, but it can be added at runtime.
If your selection was NO, then you can always fetch and build the AVS_SDK manually at runtime. All the package dependencies will already be there, so only fetching the AVS_SDK source code and building it is required.

Finish the avs-image configuration
At the end you will see a summary of the configuration you selected for your image build. The following is an example for a pre-installed AVS_SDK with Conexant sound card support and WiFi/BT not enabled:
==========================================================
  AVS configuration is now ready at conf/local.conf
    - Sound Card = Conexant
    - AVS_SDK pre-installed
  You are ready to bitbake your AVS demo image now:
    bitbake avs-image
==========================================================

Step 3: Build the AVS image
Go to your <build_sdk> directory and start the build of the avs-image. There are two options:
Regular build:
$ cd <yocto_dir>/<build_sdk>
$ bitbake avs-image
With QT5 support included:
$ cd <yocto_dir>/<build_sdk>
$ bitbake avs-image-qt5
The image with QT5 is useful if you want to add a GUI, for example to render DisplayCards.

Step 4: Deploy the built images to an SD/MMC card to boot on the target board
After a build has successfully completed, the created image resides at <build_sdk>/tmp/deploy/images/imx7d-pico/. In this directory you will find the imx7d-pico-avs.sdcard image or imx7d-pico-avs-qt5.sdcard, depending on the build you chose in Step 3.
To flash the .sdcard image into the eMMC device of your PicoPi board, follow the next steps:
Download the bootbomb flasher and follow the instructions in Section 4
(Board Reflashing) of the Quick Start Guide for the AVS kit to set up your board in flashing mode.

Copy the built SD card file:
$ sudo dd if=imx7d-pico-avs.sdcard of=/dev/sd bs=1M && sync
$ sync
Properly eject the pico-imx7d board:
$ sudo eject /dev/sd

NXP Documentation
Refer to the Quick Start Guide for the AVS SDK to fully set up your PicoPi board with the Synaptics 2Mic and PicoPi i.MX7D.
For a more comprehensive understanding of Yocto, its features and setup, more image build and deployment options, and customization, please take a look at the i.MX_Yocto_Project_User's_Guide.pdf document from the Linux documents bundle mentioned at the beginning of this document.
For a more detailed description of the Linux BSP and u-boot use and configuration, please take a look at the i.MX_Linux_User's_Guide.pdf document from the Linux documents bundle mentioned at the beginning of this document.
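When writing the .sdcard image from a Linux host, it is worth identifying the target device node first. A cautious sketch, where /dev/sdX is a placeholder for whatever node your host assigns to the board in flashing mode:
$ lsblk                                                                # identify the device node of the board/eMMC exposed in flashing mode
$ sudo dd if=imx7d-pico-avs.sdcard of=/dev/sdX bs=1M status=progress   # /dev/sdX is a placeholder; status=progress needs a recent coreutils
$ sync
$ sudo eject /dev/sdX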
View full article
Basic Linear Algebra Subprograms (BLAS) is a specification that prescribes a set of low-level routines for performing common linear algebra operations such as vector addition, scalar multiplication, dot products, linear combinations, and matrix multiplication. OpenBLAS is an optimized BLAS library which is used for deep learning acceleration in Caffe/Caffe2. I enabled it in Yocto (Rocko) by adding a .bb file, built it on i.MX6QP, i.MX7ULP and i.MX8MQ, and also ran its test example successfully. You can find the test example (openblas_utest) under the folder image/opt/openblas/bin of the OpenBLAS work directory. Currently, version 0.3.0 is supported in the .bb file.

Update: updated to v0.3.6 and enabled multi-threading by setting USE_OPENMP=1 and USE_THREAD=4 when compiling this library.
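For reference, the options mentioned above are plain OpenBLAS make variables. A minimal sketch of building the library with them outside of Yocto; the TARGET value is an assumption for a Cortex-A9 part such as i.MX6QP and should be changed per SoC, and the install prefix is only an example:
$ # Build OpenBLAS with OpenMP enabled and up to 4 threads; TARGET selects the optimized kernels
$ make USE_OPENMP=1 USE_THREAD=4 TARGET=CORTEXA9
$ make PREFIX=/opt/openblas install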
View full article
[Chinese translated version] See the attachment. Original post: https://community.nxp.com/docs/DOC-341641
View full article
In the default Linux BSP, NXP implemented LVDS-to-HDMI (it6263) and MIPI-DSI-to-HDMI (adv7535) bridge chip drivers. These drivers need to read the EDID from the display and then apply the timing parameters to the DRM driver. But for the bridge chip -> Serializer -> Deserializer -> LCD panel use case, there is no EDID. Attached are reference patches for such a use case: they bind the bridge chip to the panel directly, so no EDID is needed. The patches were tested on the i.MX8QXP MEK in bridge chip + panel mode; with them, the fb0 device can be seen under the /sys/class/graphics/ folder and the card under /sys/class/drm/. The display works fine with the DTS-selected 720p panel mode.
[2020-06-24]: Added patches for the L4.14.98 kernel:
Android_Auto_P9.0.0_GA2.1.0_Kernel_No_EDID_IT6263.patch
L4.14.98-iMX8QXP-MEK-ADV7535-MIPI-DSI-to-HDMI-bridge-chip-com.patch
View full article
When working with IPU applications, an image format converter is sometimes needed, either to check images generated by the IPU that are not readable on a PC (e.g. RGB565, a common i.MX framebuffer format -> png or jpg) or to generate an RGB picture from an encoded file to be read by the IPU (e.g. png -> RGB565 framebuffer). There are some useful tools on Linux, some of them also available on Windows, that can perform these conversions. Five tools with some usage examples are listed below.

IMAGEMAGICK
// Display a 800x600 rgb image
display -size 800x600 -depth 8 rgb:output.rgb
// Show information of output.rgb
identify -size 1296x972 -depth 8 output.rgb
// Convert a 640x480 grayscale raw rgb file to png
convert -size 640x480 -depth 8 imagefile.rgb image.png
// To list all available color formats
identify -list format
For more information about ImageMagick and its format support, access: http://www.imagemagick.org/script/formats.php

FFMPEG
// List available formats for ffmpeg
ffmpeg -pix_fmts
// Convert raw rgb565 image to png
ffmpeg -vcodec rawvideo -f rawvideo -pix_fmt rgb565 -s 1024x768 -i freescale_1024x768.raw -f image2 -vcodec png screen.png
// Convert png to raw rgb565
ffmpeg -vcodec png -i image.png -vcodec rawvideo -f rawvideo -pix_fmt rgb565 image.raw
// Convert a 720x480 NV12 (YUV 420 semi-planar) image to png
ffmpeg -s 720x480 -pix_fmt nv12 -i image-nv12.yuv -f image2 -pix_fmt rgb24 image-png.png
// Convert a 640x480 uyvy422 image to png
ffmpeg -s 640x480 -pix_fmt uyvy422 -i image-uyvy422.yuv -f image2 -pix_fmt rgb24 image-uyvy422.png

MENCODER
http://www.mplayerhq.hu/DOCS/HTML/en/encoding-guide.html

TRANSCODING
http://www.transcoding.org/cgi-bin/transcode?Examples

GRAPHICSMAGICK
http://www.graphicsmagick.org/
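A common case on i.MX is checking what the framebuffer currently contains. A small sketch built on the same ffmpeg options shown above; the 1024x768 RGB565 geometry is an assumption and must match your framebuffer configuration:
// On the target: dump the raw framebuffer contents
dd if=/dev/fb0 of=screen.raw
// On the host: convert the RGB565 dump to PNG
ffmpeg -vcodec rawvideo -f rawvideo -pix_fmt rgb565 -s 1024x768 -i screen.raw -f image2 -vcodec png screen.png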
View full article
Freescale does not have a specific GStreamer element to do JPEG encoding, so the standard 'jpegenc' should be used.

Image Capture
With a web camera:
gst-launch v4l2src num-buffers=1 ! jpegenc ! filesink location=sample.jpeg
With an embedded camera:
gst-launch mfw_v4lsrc num-buffers=1 ! jpegenc ! filesink location=sample.jpeg

More pipelines on GStreamer i.MX6 Pipelines
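If a specific capture size is needed, a caps filter can be placed between the source and jpegenc. A minimal sketch in the same GStreamer 0.10 syntax used above; 640x480 is just an example and must be a resolution your camera supports:
gst-launch v4l2src num-buffers=1 ! 'video/x-raw-yuv, width=(int)640, height=(int)480' ! jpegenc ! filesink location=sample_640x480.jpeg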
View full article
Hardware connection: there are two board-to-board connectors on the E-INK daughter card IMXEBOOKDC4, while there is only one on the i.MX7D Sabre board, as shown in the picture below. This might make connecting the two a bit confusing. As checked internally, the original design wires both the eLCDIF and EPDC buses out to one daughter card, adding the flexibility to have different configurations on one display daughter card (LCD/EPD). On the i.MX7D Sabre board, only one connector is available, for the EPDC bus. Here is how we connect the i.MX7D Sabre and IMXEBOOKDC4:

Software setup: here we use the pre-built L3.14.38_6UL7D_Beta Linux as our boot image. Steps to set up, boot and test the EPDC:
1. Download and decompress the BSP pre-built image package "L3.14.38_beta_images_MX6UL7D.tar.gz"; you should be able to find the SD image in it -- "fsl-image-gui-x11-imx7dsabresd.sdcard"
2. Program the SD image onto your SD card (>800 MBytes) with the command (run here on Ubuntu): "dd if=fsl-image-gui-x11-imx7dsabresd.sdcard of=/dev/sdb;sync"
3. Insert the SD card into the slot (J6) on the i.MX7D Sabre board, connect the debug UART and power on the board
4. Modify the U-Boot environment variables as below (a consolidated U-Boot sketch is shown after this list):
     a.) setenv fdt_file imx7d-sdb-epdc.dtb
         (originally this is "fdt_file=imx7d-sdb.dtb")
     b.) setenv mmcargs 'setenv bootargs console=${console},${baudrate} root=${mmcroot} epdc video=mxcepdcfb:E060SCM,bpp=16'
         (originally this is "mmcargs=setenv bootargs console=${console},${baudrate} root=${mmcroot}")
5. Boot into the Linux kernel and run the unit test: "/unit_tests/mxc_epdc_fb_test.out"; you should be able to see test patterns running on the EPD.
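For reference, a consolidated version of the U-Boot changes in step 4, entered at the U-Boot prompt; the prompt string and the choice to persist the variables with saveenv are assumptions about your U-Boot build:
=> setenv fdt_file imx7d-sdb-epdc.dtb
=> setenv mmcargs 'setenv bootargs console=${console},${baudrate} root=${mmcroot} epdc video=mxcepdcfb:E060SCM,bpp=16'
=> saveenv    # optional: keep the variables across reboots
=> boot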
View full article
The Linux L4.9.51 and SDK v2.3 for i.MX 8MQuad (mScale 850D) RFP (GA) release files are now available.
Linux: on the IMX_SW web page, Overview -> BSP Updates and Releases -> Linux L4.9.51 for i.MX 8MQuad GA.
SDK: on the https://mcuxpresso.nxp.com/ web page.

Files available:
Linux:
1. fsl-yocto-L4.9.51_mx8mq-ga.tar.gz : L4.9.51 i.MX 8MQuad GA Linux BSP Documentation. Includes Release Notes, User Guide.
2. L4.9.51-ga_images_mx8mq.tar.gz : Linux Binary Demo files for i.MX 8MQuad EVK
3. L4.9.51_8mq-ga_mfg-tools.tar.gz : Manufacturing Toolkit for Linux L4.9.51 i.MX 8MQuad GA
4. L4.9.51_8mq-ga_gpu-tools.tar.gz : Vivante VTK file for L4.9.51 i.MX 8MQuad GA
5. imx-aacpcodec-4.3.4.tar.gz : AAC Plus Codec for L4.9.51 of i.MX 8MQuad GA

SDK:
On https://mcuxpresso.nxp.com/, click Select Development Board to customize the SDK based on your configuration, then download the SDK package. CMSIS pack is also supported.

Target board: i.MX 8MQuad EVK

What's New/Features: Please consult the Release Notes.

Known issues: For known issues and more details please consult the Release Notes.

More information on changes of Yocto, see:
README: https://source.codeaurora.org/external/imx/imx-manifest/tree/README?h=imx-linux-morty
ChangeLog: https://source.codeaurora.org/external/imx/imx-manifest/tree/ChangeLog?h=imx-linux-morty
View full article
OpenCV is a computer vision library originally developed by Intel. It is free for commercial and research use under the open source BSD license. The library is cross-platform. It focuses mainly on real-time image processing; as such, if it finds Intel's Integrated Performance Primitives on the system, it will use these commercial optimized routines to accelerate itself.

Application
OpenCV's application areas include:
* 2D and 3D feature toolkits
* Egomotion estimation
* Face Recognition
* Gesture Recognition
* Human-Computer Interface (HCI)
* Mobile robotics
* Motion Understanding
* Object Identification
* Segmentation and Recognition
* Stereopsis (stereo vision): depth perception from 2 cameras
* Structure from motion (SFM)
* Motion Tracking

To support some of the above areas, OpenCV includes a statistical machine learning library that contains:
* Boosting
* Decision Trees
* Expectation Maximization
* k-nearest neighbor algorithm
* Naive Bayes classifier
* Artificial neural networks
* Random forest
* Support Vector Machine

Installing OpenCV on the i.MX51 EVK board running Ubuntu Linux
Assuming that you already have Ubuntu Linux running on your board, you can use this wiki page to guide you in getting your USB camera running on your system in order to use the real-time image processing features of this library. In a brand-new installation of Ubuntu some libraries are not installed by default, so you need to install them yourself (use Synaptic to do that); here is the list of these libraries:
libgtk2.0-dev
libjpeg62-dev
zlib1g-dev
libpng12-dev
libtiff4-dev
libjasper-dev
libgst-dev
libgstreamer0.10-dev
If you already have some of those libraries installed, make sure they are the DEV versions. After installing those libraries you can download the stable OpenCV version here.
Install it following the procedure below:

1 - Untar the OpenCV package:
tar -xvzf opencv-1.1pre1.tar.gz

2 - Change to the OpenCV folder:
cd opencv-1.1.0

3 - Configure the installation, enabling gstreamer and leaving the demo apps to be compiled later:
./configure --with-gstreamer --disable-apps

You will get the following results:

General configuration ================================================
      Compiler:                 g++
      CXXFLAGS:
      DEF_CXXFLAGS:             -Wall -fno-rtti -pipe -O3 -fomit-frame-pointer
      PY_CXXFLAGS:              -Wall -pipe -O3 -fomit-frame-pointer
      OCT_CXXFLAGS:             -fno-strict-aliasing -Wall -Wno-uninitialized -pipe -O3 -fomit-frame-pointer
      Install path:             /usr/local

HighGUI configuration ================================================
      Windowing system --------------
      Use Carbon / Mac OS X:        no
      Use gtk+ 2.x:                 yes
      Use gthread:                  yes
      Image I/O ---------------------
      Use ImageIO / Mac OS X:       no
      Use libjpeg:                  yes
      Use zlib:                     yes
      Use libpng:                   yes
      Use libtiff:                  yes
      Use libjasper:                yes
      Use libIlmImf:                no
      Video I/O ---------------------
      Use QuickTime / Mac OS X:     no
      Use xine:                     no
      Use gstreamer:                yes
      Use ffmpeg:                   no
      Use dc1394 & raw1394:         no
      Use v4l:                      yes
      Use v4l2:                     yes
      Use unicap:                   no

Wrappers for other languages =========================================
      SWIG Python                   no
      Octave                        no

Additional build settings ============================================
      Build demo apps               no

4 - Build OpenCV:
make

5 - Install OpenCV:
sudo make install

If all the steps above were executed properly, you can now compile the sample applications:

1 - Change to the samples/c directory:
cd samples/c

2 - Change the build_all script mode to +x:
chmod +x build_all.sh

3 - Run the script:
./build_all.sh

Now you can test. The results below were taken from the Laplacian filter sample processing, in real time, images grabbed from a USB camera: Laplacian filter with USB camera capture device.
Also, you can see its performance in a 3-window application performing color conversion and Canny edge detection at the same time: http://www.youtube.com/watch?v=w9yQgdABT7c
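As a quick sanity check after 'sudo make install', one of the C samples can also be compiled by hand against the installed library. A minimal sketch, assuming the autotools build installed opencv.pc under the default /usr/local prefix and that the Laplacian sample is named laplace.c in samples/c (adjust if your tree differs):
export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig:$PKG_CONFIG_PATH   # help pkg-config find opencv.pc if needed
pkg-config --modversion opencv        # should print the installed OpenCV version
cd samples/c
gcc laplace.c -o laplace `pkg-config --cflags --libs opencv`
./laplace                             # the Laplacian sample typically grabs frames from the first camera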
View full article
Using a USB Touchscreen on Ubuntu

This example uses a XENARC 706TSA monitor: http://www.xenarc.com/product/706tsa.html

To use a USB touchscreen on the i.MX51 EVK, disable all touchscreen drivers in menuconfig and build the kernel:
Device Drivers --->
    Input device support --->
        [ ] Touchscreens --->

Download xserver-xorg-input-evtouch (version 0.8.8-ubuntu3) from http://launchpadlibrarian.net/24760784/xserver-xorg-input-evtouch_0.8.8-0ubuntu3_armel.deb. An X crash occurs when using the latest 0.8.8-ubuntu6.1 version; for details see https://bugs.launchpad.net/ubuntu/+source/xf86-inputevtouch/+bug/511491

On the i.MX51 EVK board, run "sudo dpkg -i xserver-xorg-input-evtouch_0.8.8-0ubuntu3_armel.deb" to install the Debian package.

Add the fdi file with "sudo vi /usr/share/hal/fdi/policy/20thirdparty/50-eGalax.fdi":
<?xml version="1.0" encoding="UTF-8"?>
<deviceinfo version="0.2">
   <device>
      <match key="info.product" contains="eGalax">
         <match key="info.capabilities" contains="input">
            <merge key="input.x11_driver" type="string">evtouch</merge>
            <merge key="input.x11_options.minx" type="string">130</merge>
            <merge key="input.x11_options.miny" type="string">197</merge>
            <merge key="input.x11_options.maxx" type="string">3945</merge>
            <merge key="input.x11_options.maxy" type="string">3894</merge>
            <merge key="input.x11_options.Rotate" type="string">CCW</merge>
            <merge key="input.x11_options.Swapy" type="string">true</merge>
            <merge key="input.x11_options.taptimer" type="string">30</merge>
            <merge key="input.x11_options.longtouchtimer" type="string">750</merge>
            <merge key="input.x11_options.longtouched_action" type="string">click</merge>
            <merge key="input.x11_options.longtouched_button" type="string">3</merge>
            <merge key="input.x11_options.oneandhalftap_button" type="string">2</merge>
            <merge key="input.x11_options.movelimit" type="string">10</merge>
            <merge key="input.x11_options.touched_drag" type="string">1</merge>
            <merge key="input.x11_options.maybetapped_action" type="string">click</merge>
            <merge key="input.x11_options.maybetapped_button" type="string">1</merge>
         </match>
      </match>
   </device>
</deviceinfo>
Save the above configuration.

Calibrating
Calibration is done by clicking on System -> Administration -> Calibrate Touchscreen. Follow the on-screen instructions and reboot the system.

Calibrating using Xinput Calibrator
xinput_calibrator is another option to calibrate the touchscreen. It can be downloaded at: http://www.freedesktop.org/wiki/Software/xinput_calibrator
On i.MX5x Ubuntu, unpack the source code:
tar -xzvf xinput_calibrator-0.7.5.tar.gz
Install xorg-dev, which is required to build xinput_calibrator:
sudo apt-get install xorg-dev
Configure, build and install xinput_calibrator:
./configure
make
sudo make install
Execute xinput_calibrator. A four-point calibration screen will be shown. Follow the instructions on screen; after completion, xinput_calibrator will return the calibration parameters. Replace the given calibration parameters in the file /usr/share/hal/fdi/policy/20thirdparty/50-eGalax.fdi and reboot the system.
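Before calibrating, it can be useful to confirm that X actually sees the touchscreen as an input device. A minimal sketch using the xinput utility, assuming the xinput package is installed; "<device name>" is a placeholder for the name reported by the first command:
xinput list                          # the eGalax USB touchscreen should show up as a pointer device
xinput list-props "<device name>"    # inspect the axes/properties X reports for it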
View full article
[Chinese translated version] See the attachment. Original post: https://community.nxp.com/docs/DOC-341570
View full article
(DEPRECATED. Please check this document for Real Time Streaming.)

A server can stream video while a client, in this case an i.MX6 target, receives and decodes it. For example, a server with GStreamer and a web camera connected can stream with the following command:
$ # Pipeline 1
$ gst-launch v4l2src ! 'video/x-raw-yuv, format=(fourcc)I420, width=(int)1280, height=(int)800' ! ffenc_mpeg4 ! tcpserversink host=$CLIENT_IP port=$PORT
and on the target, the client receives, decodes and displays it with:
$ # Pipeline 2
$ gst-launch tcpclientsrc host=$SERVER_IP port=$PORT ! 'video/mpeg, width=(int)1280, height=(int)800, framerate=(fraction)10/1, mpegversion=(int)4, systemstream=(boolean)false' ! vpudec ! mfw_isink
The filter caps between the tcpclientsrc and the decoder (vpudec) depend on the sink caps coming from the server encoder (ffenc_mpeg4), so these may change depending on your needs. Running the above pipelines requires the environment variables SERVER_IP, CLIENT_IP and PORT.
In case you want the i.MX6 to act as a server, just change the video source (either mfw_v4lsrc or v4l2src) and the encoder (vpuenc), so:
$ # Pipeline 3
$ gst-launch v4l2src ! 'video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, interlaced=(boolean)false, framerate=(fraction)10/1' ! vpuenc ! tcpserversink host=$CLIENT_IP port=$PORT
For testing purposes, set SERVER_IP=127.0.0.1, CLIENT_IP=127.0.0.1 and PORT=500, and run pipelines 3 and 2 in two different consoles (see the sketch below). Check the CPU usage with 'top' and see that the VPU is actually doing most of the work.
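For the loopback test described above, a minimal sketch of setting the variables before launching the two pipelines in separate consoles, using the values suggested in the article (note that ports below 1024 normally require root privileges):
$ export SERVER_IP=127.0.0.1
$ export CLIENT_IP=127.0.0.1
$ export PORT=500
$ # Console 1: run Pipeline 3 (encoder/server); Console 2: run Pipeline 2 (decoder/client)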
View full article