i.MX Processors Knowledge Base

Description: this document explains how to develop an audio card driver for the i.MX6 platform. It covers the basics of the ASoC architecture and then gives several examples of audio driver development:
1. NXP SGTL5000: supported by default in the NXP i.MX BSP for the SabreLite board.
2. Wolfson WM8524.
   A. 3.0.35 BSP support: the i.MX6 set-top-box BSP supports it (the original link on the old Freescale community is out of date).
   B. 3.14.28 BSP support: please check the attachment.
3. Wolfson WM8960: includes how to add the Android middle layer and the driver; please check the attachment.
4. TI TLV320AIC3120: includes how to add the Android middle layer and the driver; please check the attachment.
5. TI TLV320AIC3X

Products
Product Category: MPU - NXP Part Number: i.MX6 Family - URL: https://www.nxp.com/products/processors-and-microcontrollers/arm-processors/i-mx-applications-processors/i-mx-6-processors:IMX6X_SERIES

Tools
NXP Development Board: i.MX6 SabreSDP - URL: https://www.nxp.com/design/development-boards:EVDEBRDSSYS#/collection=softwaretools&start=0&max=25&query=typeTax%3E%3Et633::archived%3E%3E0::Sub_Asset_Type%3E%3ETSP::deviceTax%3E%3Ec731_c380_c127_c126&sorting=Buy%2FSpecifications.desc&language=en&siblings=false

The article includes the document MX6X_ASOC_V5-20191115.pdf and related driver sample code.
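As a quick bring-up check once a codec and machine driver are registered (these are generic ALSA utility commands, not part of the original document), you can verify that the new sound card shows up on the target:

# list playback and capture devices; the new ASoC card should appear here
aplay -l
arecord -l
# registered card names are also visible in procfs
cat /proc/asound/cards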
View full article
///////////////////////////create device node /dev/galcore/////////////////////////////
$home/myandroid/kernel_imx/drivers/mxc/gpu-viv/Kbuild
    MODULE_NAME ?= galcore  /* define node name */
$home/myandroid/kernel_imx/drivers/mxc/gpu-viv/hal/os/linux/kernel/gc_hal_kernel_linux.h
    #define DEVICE_NAME "galcore"
$home/myandroid/kernel_imx/drivers/mxc/gpu-viv/hal/os/linux/kernel/gc_hal_kernel_probe.c
    drv_init calls ret = register_chrdev(major, DEVICE_NAME, &driver_fops);

///////////////////////////////opengles2 functions///////////////////////////////////////////
myandroid/device/fsl-proprietary/gpu-viv/lib/egl/libGLESv2_VIVANTE.so
    glActiveTexture
    glBindBuffer
    ...
These glXxxx entry points call into sub_D40C:

int __fastcall sub_D40C(int a1, int a2, int a3)  // address 0x0000D40C
{
  int result; // r0@1
  int v4;
  int v5;

  v4 = a2;
  v5 = a3;
  gcoOS_GetTLS(&v4);  // ------------> goes to libGAL.so
  result = v4;
  if ( v4 )
    result = *(_DWORD *)(v4 + 36);
  return result;
}

and $home/myandroid/device/fsl-proprietary/gpu-viv/lib/libGAL.so:

// exported function
signed int __fastcall gcoOS_GetTLS(void **a1)
{
  ...
  v4 = open("/dev/galcore", 2);
  ...
}

The device node /dev/galcore then passes the command into the galcore module:
$home/myandroid/kernel_imx/drivers/mxc/gpu-viv/hal/kernel/gc_hal_kernel.c
    gckKERNEL_Dispatch

This document was generated from the following discussion: Vivante 3D GC2000 work flow
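As a quick sanity check on a running board (illustrative commands, not part of the original discussion), you can confirm that the galcore character device and driver are in place:

# device node created through register_chrdev()
ls -l /dev/galcore
# major number registered under the name "galcore"
grep galcore /proc/devices
# only applicable if the GPU driver was built as a loadable module
lsmod | grep galcore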
View full article
When you want to play a streaming URL, it can be inconvenient if the browser does not recognize the URL as a media stream: it downloads the content instead of handing it to Gallery for playback. One way around this is to write an APK that uses VideoView to play the URL/media stream, but you can also start playback of a media file or network stream directly from the console. Here are the commands.

Gingerbread
am start -n com.cooliris.media/com.cooliris.media.MovieView -d "<URL>"
The URL can be a local file path or a network stream URL. For example, you can play a local file with:
am start -n com.cooliris.media/com.cooliris.media.MovieView -d "/mnt/sdcard/test.mp4"
You can also play an HTTP stream:
am start -n com.cooliris.media/com.cooliris.media.MovieView -d "http://v.iask.com/v_play_ipad.php?vid=76710932"
Or play an RTSP stream:
am start -n com.cooliris.media/com.cooliris.media.MovieView -d "rtsp://10.0.2.1:554/stream"

ICS
am start -n com.android.gallery3d/com.android.gallery3d.app.MovieActivity -d "<URL>"
The URL has the same definition as on Gingerbread.
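If you are working from a host PC rather than the target's console, the same command can also be issued over adb (a usage example; the URL below is just a placeholder):

adb shell am start -n com.android.gallery3d/com.android.gallery3d.app.MovieActivity -d "http://example.com/stream.mp4"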
View full article
The following is a list of documents, questions and discussions that are relevant in the community, based on the number of views. If you are having a problem or a doubt, or are getting started with i.MX processors, you should check the following links to see if your question is covered there.

Yocto Project
Freescale Yocto Project main page
Yocto Training - HOME
i.MX Yocto Project: Frequently Asked Questions
Useful bitbake commands
Yocto Project Package Management - smart
How to add a new layer and a new recipe in Yocto
Setting up the Eclipse IDE for Yocto Application Development
Guide to the .sdcard format
Yocto NFS & TFTP boot
YOCTO project clean
Yocto with a package manager (ex: apt-get)
Yocto Setting the Default Ethernet address and disable DHCP on boot.

i.MX
Building QT for i.MX6
i.MX6/7 DDR Stress Test Tool V3.00
i.MX6DQSDL DDR3 Script Aid
Installing Ubuntu Rootfs on NXP i.MX6 boards
iMX6DQ MAX9286 MIPI CSI2 720P camera surround view solution for Linux BSP
i.MX Design&Tool Lists
Simple GPIO Example - quandry
i.MX6 GStreamer-imx Plugins - Tutorial & Example Pipelines
Streaming USB Webcam over Network
Step-by-step: How to setup TI Wilink (WL18xx) with iMX6 Linux 3.10.53

Linux / Kernel
Copying Files Between Windows and Linux using PuTTY
Building Linux Kernel
Patch to support uboot logo keep from uboot to kernel for NXP Linux and Android BSP (HDMI, LCD and LVDS)
load kernel from SD card in U-boot
Changing the Kernel configuration for i.MX6 SABRE

Android
The Android Booting process
What is inside the init.rc and what is it used for.

Others
How to use qtmultimedia(QML) with Gstreamer 1.0
View full article
BSP version: 11.03
Multimedia Package version: 11.03

1. Install the BSP and Multimedia package (11.03 release).
2. Avoid display timeout by appending the following line to rootfs/etc/profile:
echo -e -n '\033[9]' > /dev/tty0
3. Set the VGA port as the primary display in the kernel command line:
video=mxcdi1fb:GBR24,VGA-XGA di1_primary vga
4. Connect a VGA monitor and the WVGA display to the MX53 Quick Start board.
5. Boot Linux on the MX53 Quick Start board (NFS is used in this example).
6. Unblank the WVGA display (fb1):
$ echo 0 > /sys/class/graphics/fb1/blank
7. On the target, enter the /dev/shm directory. If the files vss_lock and vss_shmem are present, delete them.
8. On your host, edit ltib/rootfs/usr/share/vssconfig as follows (vss device definition: Master=VGA, Slave=WVGA):

master display
[VGA]
type = framebuffer
format = RGBP
fb_num = 2
main_fb_num = 0
vs_max = 4

slave display
[WVGA]
type = framebuffer
format = RGBP
fb_num = 1
vs_max = 4

9. Run the GStreamer pipeline below:
gst-launch filesrc location=file.mp4 ! qtdemux ! mfw_vpudecoder ! mfw_isink display=VGA display-1=WVGA

Video is played on the VGA and WVGA panels. A 720p file can be played at the same time.
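Before unblanking, it can help to confirm which framebuffer corresponds to which display (a generic check, not part of the original steps):

# print the driver name behind each framebuffer node
cat /sys/class/graphics/fb*/name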
View full article
Video decoding
gst-launch filesrc location=sample.mp4 ! qtdemux ! ffdec_h264 ! mfw_v4lsink

Notes:
On LTIB BSP 3.0.35_4.0.0, prep the package and apply the attached patch on top, then build.
On Yocto, the easy way to add the gst-ffmpeg package is by adding these two lines to the conf/local.conf file:
IMAGE_INSTALL_append = " gst-ffmpeg"
LICENSE_FLAGS_WHITELIST = 'commercial'
View full article
[Chinese translated version] See the attachment. Original article: https://community.nxp.com/docs/DOC-342059
View full article
OpenCV is a computer vision library originally developed by Intel. It is free for commercial and research use under the open-source BSD license. The library is cross-platform and focuses mainly on real-time image processing; if it finds Intel's Integrated Performance Primitives on the system, it will use these commercial optimized routines to accelerate itself.

Application
OpenCV's application areas include:
* 2D and 3D feature toolkits
* Egomotion estimation
* Face recognition
* Gesture recognition
* Human-computer interface (HCI)
* Mobile robotics
* Motion understanding
* Object identification
* Segmentation and recognition
* Stereopsis (stereo vision: depth perception from 2 cameras)
* Structure from motion (SFM)
* Motion tracking

To support some of the above areas, OpenCV includes a statistical machine learning library that contains:
* Boosting
* Decision trees
* Expectation maximization
* k-nearest neighbor algorithm
* Naive Bayes classifier
* Artificial neural networks
* Random forest
* Support vector machine

Installing OpenCV on the i.MX51 EVK board running Ubuntu Linux
Assuming that you already have Ubuntu Linux running on your board, you can use this wiki page as a guide to get your USB camera working on your system in order to use the real-time image processing features of this library. In a brand new installation of Ubuntu some libraries are not installed by default, so you need to install them yourself (use synaptic to do that). Here is the list of these libraries:
libgtk2.0-dev
libjpeg62-dev
zlib1g-dev
libpng12-dev
libtiff4-dev
libjasper-dev
libgst-dev
libgstreamer0.10-dev
If you already have some of these libraries installed, make sure they are the -dev versions. After installing them you can download the stable OpenCV version here.

Install it following the procedure below:

1 - Untar the OpenCV package:
tar -xvzf opencv-1.1pre1.tar.gz
2 - Change to the OpenCV folder:
cd opencv-1.1.0
3 - Configure the installation, enabling GStreamer and leaving the demo apps to be compiled later:
./configure --with-gstreamer --disable-apps

You will get the following results:

General configuration ================================================
    Compiler:        g++
    CXXFLAGS:
    DEF_CXXFLAGS:    -Wall -fno-rtti -pipe -O3 -fomit-frame-pointer
    PY_CXXFLAGS:     -Wall -pipe -O3 -fomit-frame-pointer
    OCT_CXXFLAGS:    -fno-strict-aliasing -Wall -Wno-uninitialized -pipe -O3 -fomit-frame-pointer
    Install path:    /usr/local

HighGUI configuration ================================================
    Windowing system --------------
    Use Carbon / Mac OS X:    no
    Use gtk+ 2.x:             yes
    Use gthread:              yes
    Image I/O ---------------------
    Use ImageIO / Mac OS X:   no
    Use libjpeg:              yes
    Use zlib:                 yes
    Use libpng:               yes
    Use libtiff:              yes
    Use libjasper:            yes
    Use libIlmImf:            no
    Video I/O ---------------------
    Use QuickTime / Mac OS X: no
    Use xine:                 no
    Use gstreamer:            yes
    Use ffmpeg:               no
    Use dc1394 & raw1394:     no
    Use v4l:                  yes
    Use v4l2:                 yes
    Use unicap:               no

Wrappers for other languages =========================================
    SWIG Python               no
    Octave                    no

Additional build settings ============================================
    Build demo apps           no

Now run make...

4 - Build OpenCV:
make
5 - Install OpenCV:
sudo make install

If all the steps above were executed properly, you can now compile the sample applications:
1 - Change to the samples/c directory:
cd samples/c
2 - Make the build_all script executable:
chmod +x build_all.sh
3 - Run the script:
./build_all.sh

Now you can test. The results below were taken from the Laplacian filter sample processing, in real time, images grabbed from a USB camera: Laplacian filter with USB camera capture device. Also, you can see its performance in a 3-window application performing color conversion and Canny edge detection at the same time: http://www.youtube.com/watch?v=w9yQgdABT7c EOF !
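If you prefer to build a single sample by hand instead of using build_all.sh, something along these lines should work once the library is installed (this assumes the install step placed an opencv.pc file where pkg-config can find it; laplace.c is one of the samples shipped in samples/c):

# compile one OpenCV C sample against the installed library and run it
g++ laplace.c -o laplace `pkg-config --cflags --libs opencv`
./laplace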
View full article
The document includes the following contents:
(1) How to port the OV5645 to Android JB4.2.2
(2) OV5645 driver for Linux 3.0.35
(3) OV5645 schematic based on i.MX6Q/DL
(4) OV5645 support for the Android camera HAL

[Note:]
P5V29A-0JG is a camera module based on the OV5645, and PAO532-0JG is based on the OV5640; both are manufactured by NINGBO SUNNY OPOTECH CO., LTD (China). If customers want to use them on the i.MX6 platform, they can send me an email to ask for the datasheets of the P5V29A & PAO532, or discuss the corresponding porting questions here.
Email: weidong.sun@freescale.com
View full article
Hello, in this post I will explain how to record separated audio channels using an 8MIC-RPI-MX8 board. As background on how to set up the board to record and play audio with i.MX boards, I suggest you take a look at this post: How to configure, record and play audio using an 8MIC-RPI-MX8 Board.

Requirements:
i.MX 8M Mini EVK.
Linux Binary Demo Files - i.MX 8M Mini EVK.
8MIC-RPI-MX8 Board.
Serial console emulator (Tera Term, PuTTY, etc.).
Headphones/speakers.

Waveform Audio Format
WAV, also known as WAVE (Waveform Audio File Format), is a subset of Microsoft's Resource Interchange File Format (RIFF) specification for storing digital audio files. This format does not apply compression to the information and stores the audio at different sampling rates and bit rates. WAV files are larger than formats such as MP3, which uses compression to reduce the file size while maintaining good audio quality; there is, however, always some loss of quality, since audio information is too random to be compressed well with conventional methods. The main advantage of WAV is that it provides an audio file without losses, and it is widely used in studios. These files start with a file header followed by the data chunks. A WAV file consists of two sub-chunks:
fmt chunk: data format.
data chunk: sample data.
So it is structured as metadata, called the WAV file header, plus the actual audio information. The header of a WAV (RIFF) file is 44 bytes long: the RIFF chunk descriptor ("RIFF", total size, "WAVE"), followed by the fmt sub-chunk (audio format, number of channels, sample rate, byte rate, block align, bits per sample) and the data sub-chunk header (chunk ID and data size).

How to separate the channels?
To separate each audio channel from the recording, we need to use the following command, which records the raw data of each channel into its own file:

arecord -D plughw:<audio device> -c<number of channels> -f <format> -r <sample rate> -d <duration of the recording> --separate-channels <output file name>.wav

arecord -D plughw:2,0 -c8 -f s16_le -r 48000 -d 10 --separate-channels sample.wav

This command outputs the raw data of the recorded channels, one file per channel (sample.wav.0, sample.wav.1, ...). This raw data cannot be used as a "normal" .wav file because the header information is missing; you can confirm this by importing the raw data into a DAW and playing the recorded samples.

So, to use this information, we need to create the header for each file using the wave library in Python. Here is the script that I used:

import wave
import os

name = input("Enter the name of the audio file: ")
os.system("arecord -D plughw:2,0 -c8 -f s16_le -r 48000 -d 10 --separate-channels " + name + ".wav")

for i in range(0, 8):
    # read the raw (headerless) data of channel i
    with open(name + ".wav." + str(i), "rb") as in_file:
        data = in_file.read()
    # write it back as a mono, 16-bit, 48 kHz WAV file
    with wave.open(name + "_channel_" + str(i) + ".wav", "wb") as out_file:
        out_file.setnchannels(1)
        out_file.setsampwidth(2)
        out_file.setframerate(48000)
        out_file.writeframesraw(data)

os.system("mkdir output_files")
os.system("mv " + name + "_channel_" + "* " + "output_files")
os.system("rm " + name + ".wav.*")

If we run the script, it will generate a directory with the eight audio channels in .wav format. Now we will be able to play each channel individually using an audio player (see the playback example after the references below).

References
IBM, Microsoft Corporation. (1991). Multimedia Programming Interface and Data Specifications 1.0.
Microsoft Corporation. (1994). New Multimedia Data Types and Data Techniques.
Stanford University. (2024, January 30). WAVE PCM sound file format. Retrieved from http://hummer.stanford.edu/sig/doc/classes/SoundHeader/WaveFormat/
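As a quick usage check (not part of the original post), one of the generated mono files can be auditioned straight from the command line; the device name below is an assumption, adjust it to your setup:

# play channel 0 of the split recording on the default ALSA device
aplay output_files/sample_channel_0.wav
# or target a specific card/device explicitly
aplay -D plughw:0,0 output_files/sample_channel_0.wav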
View full article
One of the most popular use cases for embedded systems is projects intended to show information and interact with users. These views are called GUIs, or Graphical User Interfaces, and are designed to be intuitive, attractive, consistent, and clear. There are many tools we can use to build great GUIs, mostly targeting platforms such as Web, Android, and iOS.

Here we need to introduce the concept of a framework: basically, a set of tools and rules that provides a minimal structure to start your development from. Frameworks usually come with configuration files, code snippets, and a file and folder organization, helping us save time and effort. It is also important to review the concept of an SDK, or Software Development Kit, which is a set of tools for building software for specific platforms, usually supplying debugging tools, documentation, libraries, APIs, emulators, and sample code.

Flutter is an open-source UI software development kit by Google that helps us create applications with great GUIs on different platforms from a single codebase. Depending on the reference, you can find Flutter defined as a framework or as an SDK, and both are correct; however, SDK may be the better definition, because Flutter supplies a wide and complete package for creating an application, of which the framework is only one part.

This article is aimed at those who are in a prototyping stage and looking for a different tool to develop projects. It is intended as a theoretical introduction explaining the most important concepts; it is still good practice to learn more by reviewing the official documentation from Flutter (Flutter documentation | Flutter).

Here is the structure used throughout this article:
What is Flutter?
Flutter details
  Platforms
  Programming language
  Official documentation
Flutter for embedded systems

What is Flutter?
Flutter was officially released by Google in December 2018 with one main aim: to give developers a tool to create applications natively compiled for mobile (Android, iOS), web and desktop (Windows, Linux) from a single codebase. As a developer this means Flutter will create a structure with minimal code, configuration files, build files for each operating system, manifests, etc., to which we add our custom code and finally build for our preferred OS. For example, we can create an application to review fruit and vegetable information and compile it for Android and iOS from the same code. A basic Flutter development process, based on my experience, looks like the following diagram.

Flutter has the following key features:

Cross-platform development.
Flutter allows the developer to create applications for different platforms using a single codebase, so you will not need to recreate the application for each platform you want to support.

Hot reload.
This feature allows the developer to see changes in real time without restarting the whole application, which results in time savings for your project.

High performance.
Flutter apps achieve high performance because the app code is compiled to native ARM code; no interpreters are involved.

UI widgets.
Flutter supplies a set of widgets (UI components such as boxes, text inputs, buttons, etc.) predefined by the UI design systems: Material on Android and Cupertino for iOS.
Source: Material 3 Design Kit | Figma Community
Source: Design - Apple Developer

Great community support.
This feature may be subjective, but when developing a project it is useful to be able to find solutions to known issues or report new ones. Because Flutter is open source and widely adopted in industry, it has a big community, with events, forums, and documentation.

Flutter Details

Supported platforms
With Flutter you can create applications for:
Android
iOS
Linux Debian, Linux Ubuntu
macOS
Web: Chrome, Firefox, Safari, Edge
Windows
Supported deployment platforms | Flutter

Programming language
Flutter uses Dart, an open-source programming language supported by Google and optimized for building user interfaces. Dart key features:

Statically typed. This helps catch errors and makes the code robust by ensuring that a variable's value always matches the declared type.
Null safety. Variables in Dart are non-nullable by default, which means every such variable must have a non-null value, avoiding errors at execution time. This also makes the code robust and secure.
Async/await. Dart is client-optimized, meaning the language was created to ensure the best performance in client applications. Async/await is part of this optimization, making it easier to manage network requests and other asynchronous operations.
Object oriented. Dart is an object-oriented language with classes and mixins. This is especially useful in Flutter, given the usage of widgets.
Compiler support for Just-In-Time (JIT) and Ahead-Of-Time (AOT) compilation. The JIT compiler provides the support that enables the hot-reload feature mentioned before: it is a complex mechanism, but in short Dart detects changes in your code and executes only those changes, avoiding recompiling all the code. The AOT compiler produces efficient ARM code, improving start-up time and performance.

Official documentation
Flutter has a rich community and documentation that goes from UI guidelines to an architectural overview. You can find the official documentation at the following links:
Flutter Official Documentation: Flutter documentation | Flutter
Flutter Community: Community (flutter.dev)
Dart Official Documentation: Dart documentation | Dart

Flutter for embedded systems
So far we know the excellent features and platforms that Flutter supports. But what about embedded systems? The official documentation states that Flutter may be used for embedded systems, but in fact there is no officially supported embedded platform. Support has come from the community; in particular, there is one repository on GitHub, maintained by Sony, that provides documentation and Yocto recipes to support Flutter on embedded Linux.

To understand why Flutter for Linux desktop (officially supported) differs from a dedicated Flutter support for embedded Linux, it is important to describe the basics of the Flutter architecture. Based on the Flutter documentation, the system is designed in layers, illustrated as follows:
Source: Flutter architectural overview | Flutter

The top layer, "Framework," is a high-level layer that includes the widgets, tools and libraries that developers interact with. Below the Framework, the "Engine" layer is responsible for drawing the widgets specified in the layer above and provides the connection between high-level and low-level code. This layer is mostly written in C++, which is why Flutter can achieve high performance when running applications.
Specifically for graphics rendering, Flutter uses Impeller on iOS and Skia on the rest of the platforms. The bottom layer is the "Embedder," which is specific to each target and operating system; it allows a Flutter application to run as a native app by providing access to services managed by the operating system, such as input, rendering surfaces and accessibility. For Linux desktop this layer uses GTK/GDK and X11 as the backend, which depends on libraries that are unnecessary and expensive for embedded systems with constrained computation and memory resources. The workaround provided by Sony's Flutter for Embedded Linux repository is to replace this backend with Wayland, a backend widely used on embedded systems. The following image illustrates the difference between Flutter for Linux desktop and Flutter for embedded Linux.
Source: What's the difference between Linux desktop and Embedded Linux · sony/flutter-embedded-linux Wiki · GitHub

Here is the link to the mentioned repository: GitHub - sony/flutter-elinux: Flutter tools for embedded Linux (eLinux)

Finally, I would like to encourage you to read the official Flutter documentation and consider this tool as a great option compared to widely used tools on embedded devices such as Qt or Chromium. Also, please have a look at a great article written by Payam Zahedi delving into the implementation of Flutter for Embedded Linux, measuring performance and giving conclusions about the usage of Flutter in embedded systems (Flutter on Embedded Devices. Learn how to run Flutter on embedded… | by Payam Zahedi | Snapp Embedded | Medium).
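For reference, a minimal sketch of how the Sony tooling is typically used (command names are taken from the flutter-elinux repository documentation and may change; double-check against the current README before relying on them):

# get the flutter-elinux tool, which wraps the standard flutter CLI
git clone https://github.com/sony/flutter-elinux.git
export PATH=$PATH:$(pwd)/flutter-elinux/bin
# create a sample app and run it against the Wayland backend
flutter-elinux create sample_app
cd sample_app
flutter-elinux devices
flutter-elinux run -d elinux-wayland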
View full article
vpuwrapper can drive the VPU decoder/encoder. If the customer's use case is simple, for example they just need to encode a YUV stream to H.264, or decode an H.264 stream to YUV, there is no need to use the complex GStreamer or V4L2 frameworks; you can use vpuwrapper instead.

Platform: i.MX8MP + L5.4.70.2.3.0

Build procedure:
mkdir vpu
cd vpu
git clone https://github.com/nxp-imx/imx-vpuwrap
cd imx-vpuwrap/
git tag -l
git switch -c rel_imx_5.4.70_2.3.0
source ../../.././5.4.70.2.3.0/sdk/environment-setup-aarch64-poky-linux
make -f Makefile_8mp

Test on the i.MX8MP EVK board
Please find the attached test log for decode and encode.
If busChromaU in the YUV file is null, encoding will fail; please apply the patch "vpuwraper patch for L5.4.70.2.3.0.patch" to fix it.
If the YUV file is in interleaved format, you need to add the -interleave parameter:
./test_enc_arm_elinux -i test.yuv -o aaa.h264 -f 2 -w 176 -h 96 -interleave 1

Thanks,
Lambert
View full article
The i.MX8 series contains an internal HiFi4 DSP targeted at audio signal processing. SOF (Sound Open Firmware) is open-source audio DSP firmware, driver and SDK. This document introduces the basic theory of IIR/FIR digital filters, how to design IIR/FIR digital filters, and the equalizer filter implementation in SOF. After that, the document also describes how the HiFi4 DSP MAC engine accelerates the EQ filter calculation.
View full article
On behalf of Gopise Yuan. A collection of several GStreamer debugging tips and know-how.

When you need to render onto a DRM layer/plane directly without going through the compositor, kmssink is a good choice:
// kmssink, with scaling, adjusted alpha property (opaque) and zpos (zpos requires kmssink >= 1.16):
gst-launch-1.0 filesrc location=/media/AVC-AAC-720P-3M_Alan.mov ! decodebin ! imxvideoconvert_g2d ! kmssink plane-id=37 render-rectangle="<100,100,720,480>" can-scale=false plane-properties=s,alpha=65535,zpos=2

When using playbin, you can still customize the pipeline beyond the sink plugin, e.g. add a converter plugin:
// playbin with additional customization of the converter before the sink:
gst-launch-1.0 playbin uri=file:///mnt/MP4_H264_AAC_1920x1080.mp4 video-sink="imxvideoconvert_g2d ! video/x-raw,format=BGRA,width=1920,height=1080 ! kmssink plane-id=44"

GStreamer can generate a pipeline graph for analyzing the pipeline in an intuitive manner:
// Generate pipeline graph:
1. Export GST_DEBUG_DUMP_DOT_DIR=<dump-folder> and GST_DEBUG=4.
2. Run the pipeline with gst-launch or others.
3. Copy all dump files (.dot) from <dump-folder>. Note: one dump file is created for each state transition. Normally, what we need is PAUSED_READY or READY_PAUSED, after which the pipeline has been set up.
4. Convert the .dot file to PDF with Graphviz:
dot -Tpdf 0.00.03.685443250-gst-launch.PAUSED_READY.dot > pipeline_PAUSED_READY.pdf
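A small convenience on top of step 4 (not from the original tips): when a run produces several .dot dumps, they can all be converted in one go, assuming Graphviz is installed:

# convert every pipeline dump in the dump folder to PDF
for f in *.dot; do dot -Tpdf "$f" > "${f%.dot}.pdf"; done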
View full article
This is based on the L5.10.35 BSP, where you have to install a Qt static build.

Qt 5.15 static build:
Assuming your sysroot is at "/sysroot-cross", your toolchain is at "/Toolchain" and your Qt source is at /Qt-5.15:

PATH=/sysroot-cross/bin:/sysroot-cross/sbin:/Toolchain/bin
mkdir /Qt-5.15/mkspecs/qws/linux-imx6-g++

Create in this directory the text file "qmake.conf" with this content:

####################### snip qmake.conf ##############################
include(../../common/linux.conf)
include(../../common/qws.conf)

# modifications to g++.conf
QMAKE_CC                = arm-linux-gnueabi-gcc
QMAKE_CFLAGS            = -pipe -isystem /sysroot-cross/include -isystem /sysroot-cross/usr/include
QMAKE_CXX               = arm-linux-gnueabi-g++
QMAKE_CXXFLAGS          = -pipe -isystem /sysroot-cross/include -isystem /sysroot-cross/usr/include
QMAKE_INCDIR            = /sysroot-cross/include /sysroot-cross/usr/include
QMAKE_LIBDIR            = /sysroot-cross/lib /sysroot-cross/usr/lib
QMAKE_LINK              = arm-linux-gnueabi-g++
QMAKE_LINK_SHLIB        = arm-linux-gnueabi-g++
QMAKE_LFLAGS            = -L/sysroot-cross/lib -L/sysroot-cross/usr/lib -Wl,-rpath-link -Wl,/sysroot-cross/lib
QMAKE_LFLAGS           += -Wl,-rpath-link -Wl,/sysroot-cross/usr/lib

# OpenGL
QMAKE_INCDIR_OPENGL = /Vivante/include
QMAKE_INCDIR_OPENGL += /Vivante/include/GL
QMAKE_INCDIR_OPENGL += /Vivante/include/EGL
QMAKE_INCDIR_OPENGL += /Vivante/include/GLES2
QMAKE_LIBDIR_OPENGL = /Vivante/lib
QMAKE_INCDIR_OPENGL_ES1 = $$QMAKE_INCDIR_OPENGL
QMAKE_LIBDIR_OPENGL_ES1 = $$QMAKE_LIBDIR_OPENGL
QMAKE_INCDIR_OPENGL_ES1CL = $$QMAKE_INCDIR_OPENGL
QMAKE_LIBDIR_OPENGL_ES1CL = $$QMAKE_LIBDIR_OPENGL
QMAKE_INCDIR_OPENGL_ES2 = /Vivante/include
QMAKE_INCDIR_OPENGL_ES2 += /Vivante/include/EGL
QMAKE_INCDIR_OPENGL_ES2 += /Vivante/include/GLES2
QMAKE_LIBDIR_OPENGL_ES2 = $$QMAKE_LIBDIR_OPENGL
QMAKE_INCDIR_EGL = $$QMAKE_INCDIR_OPENGL_ES2
QMAKE_LIBDIR_EGL = $$QMAKE_LIBDIR_OPENGL
QMAKE_LIBS_EGL = -lEGL -lGAL -lGLESv2 -lGLES_CM
QMAKE_LIBS_OPENGL_ES2 = -lEGL -lGAL -lGLESv2 -lGLES_CM
QMAKE_LIBS_OPENGL = -lEGL -lGAL -lGLESv2 -lGLES_CM
QMAKE_LIBS_OPENGL_QT = -lEGL -lGAL -lGLESv2 -lGLES_CM
QMAKE_LIBS_OPENGL_ES1 =
QMAKE_LIBS_OPENGL_ES1CL =

# modifications to linux.conf
QMAKE_AR                = arm-linux-gnueabi-ar cqs
QMAKE_OBJCOPY           = arm-linux-gnueabi-objcopy
QMAKE_STRIP             = arm-linux-gnueabi-strip
QMAKE_CFLAGS_RELEASE    = -pipe -isystem /sysroot-cross/include -isystem /sysroot-cross/usr/include

load(qt_config)
####################### snip qmake.conf ##############################

Create in the same directory the text file "qplatformdefs.h":

####################### snip qplatformdefs.h ##############################
#include "../../linux-g++/qplatformdefs.h"
####################### snip qplatformdefs.h ##############################

Now go to the /Qt-5.15 directory:
cd /Qt-5.15

Call configure with:

./configure -opensource -confirm-license -release -no-rpath -no-fast \
    -no-sql-ibase -no-sql-mysql -no-sql-odbc -no-sql-psql -no-sql-sqlite2 \
    -no-qt3support -no-mmx -no-3dnow -no-sse -no-sse2 -no-sse3 -no-ssse3 \
    -no-sse4.1 -no-sse4.2 -no-avx -no-optimized-qmake -no-nis -no-cups -pch \
    -reduce-relocations -force-pkg-config -prefix /usr -no-armfpa -make libs \
    -nomake docs -little-endian -embedded armv6 -qt-decoration-styled \
    -depths all -xplatform qws/linux-imx6-g++ -iconv -largefile -qt-gfx-linuxfb \
    -qt-gfx-multiscreen -qt-mouse-pc -qt-mouse-linuxinput -qt-libpng \
    -plugin-gfx-directfb -system-zlib -no-accessibility -no-gfx-transformed \
    -no-gfx-qvfb -no-gfx-vnc -no-kbd-tty -no-kbd-linuxinput -no-kbd-qvfb \
    -no-mouse-linuxtp -no-mouse-tslib -no-mouse-qvfb -no-libmng -no-libtiff \
    -no-gif -no-libjpeg -no-freetype -no-stl -no-glib -no-openssl -no-egl \
    -no-xmlpatterns -no-exceptions -no-multimedia -no-audio-backend -no-phonon \
    -no-phonon-backend -no-webkit -no-script -no-scripttools -no-svg -no-script \
    -no-declarative -no-sql-sqlite -no-qdbus -no-opengl -static -nomake tools \
    -nomake examples -nomake demos

When configuring has finished, call:
make

After a looong time, when everything goes right, we have a statically compiled Qt. DO NOT call "make install". We will install manually:

copy from /Qt-5.15/bin the files moc, uic, rcc and qmake to somewhere in PATH, e.g. /sysroot-cross/bin
copy the contents of dir /Qt-5.15/mkspecs to /sysroot-cross/usr/mkspecs
copy the contents of dir /Qt-5.15/plugins to /sysroot-cross/usr/plugins
copy the contents of dir /Qt-5.15/include to /sysroot-cross/usr/include
copy the contents of dir /Qt-5.15/lib to /sysroot-cross/usr/lib

Test application camtest:

If you don't have/want the directfb plugin, remove from camtest.pro the lines:
LIBS += -L/sysroot-cross/usr/plugins/gfxdrivers
QTPLUGIN += QDirectFBScreen
and these lines from main.cpp:
#include <QtPlugin>
Q_IMPORT_PLUGIN(qdirectfbscreen)

Generate the makefile by typing:
/sysroot-cross/bin/qmake -spec /sysroot-cross/usr/mkspecs/qws/linux-imx6-g++ camtest.pro
then:
make

You should set up and activate your framebuffers with this script:
################# snip ################################
fbset -fb /dev/fb0 -g 1024 768 1024 2304 16
echo -n 0 > /sys/class/graphics/fb0/blank
fbset -fb /dev/fb1 -g 1024 768 1024 1536 32
echo -n 0 > /sys/class/graphics/fb1/blank
modprobe galcore
modprobe uvcvideo
modprobe mxc_v4l2_capture
################# snip ################################

If you use directfb, then your /etc/directfbrc file should look like this:
######################## snip /etc/directfbrc #############
system=fbdev
fbdev=/dev/fb1
mode=1024x768
depth=32
pixelformat=ARGB
no-cursor
window-surface-policy=systemonly
######################## snip /etc/directfbrc #############

To start the application with directfb:
./camtest -qws -display directfb
Without directfb, using linuxfb:
./camtest -qws -display linuxfb:/dev/fb1

Notes about the application:
1. The application shows 2 webcams on the background framebuffer (BG-FB). The foreground framebuffer (FG-FB) shows the Qt GUI. The FG-FB is configured to be fully opaque and uses color-keying. On the BG-FB one camera is overlaid on the other using the IPU.

Optimization possibilities: the app copies the frames from the cams with memcpy. This wouldn't be necessary if the kernel USB webcam interface (uvc) supported the V4L2_MEMORY_USERPTR method; that way, you could pass the mmapped IPU input buffers directly to the V4L2 output buffers.

If you get errors like NOSPC (-28) from uvc, this is a limitation of USB. My board is an MX6Q Sabre, where the two webcams are connected to the same USB controller. With both webcams I had to limit the frame size to 320x250 and 160x120 at 25 Hz. You might try higher resolutions if you have other types of webcams (not USB).

Have fun
View full article
Video rendering
gst-launch videotestsrc ! mfw_v4lsink

Audio rendering
gst-launch audiotestsrc ! alsasink

WAV audio rendering
gst-launch filesrc location=test.wav ! wavparse ! alsasink

Video rendering selecting caps
gst-launch videotestsrc ! capsfilter caps='video/x-raw-yuv,format=(fourcc)I420' ! mfw_v4lsink
gst-launch videotestsrc ! 'video/x-raw-yuv,format=(fourcc)I420' ! mfw_v4lsink
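For playing a real media file with both audio and video, the playbin2 element used elsewhere in this knowledge base selects the demuxer, decoders and sinks automatically (the file path below is just a placeholder):

gst-launch playbin2 uri=file:///mnt/sdcard/sample.avi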
View full article
In order to get USB cameras (webcams) working on an i.MX51 EVK board running Ubuntu, a few steps must be followed:
Enable the USB camera drivers in the kernel
Test it using GStreamer or another compatible application (such as Cheese)

Kernel driver
USB cameras (webcams) on Linux work over the GSPCA driver. To enable this driver, go to:
./ltib -c
  [*] Configure the kernel
    Device Drivers -->
      Multimedia Devices -->
        [*] Video Capture Adapters -->
          [*] V4L USB Devices -->
            <*> USB Video Class (UVC)
            [*] UVC input events device support
            <*> GSPCA Based WebCams -->
From this point, you need to choose your specific driver. If you don't know which one, you can select all of those options as built-in "<*>" and that will work. GSPCA Drivers

USB camera detection
Connect your USB camera to the USB host port on the i.MX51 EVK board, then type "dmesg", and also check whether there is a video0 device:
ubuntu@ubuntu-desktop:~$ ls /dev/video0
/dev/video0

GStreamer command line
In order to test your USB camera using the GStreamer plugin, use the following command line:
ubuntu@ubuntu-desktop:~$ gst-launch-0.10 v4l2src ! ffmpegcolorspace ! ximagesink
and the results: Hi there !!! EOF !
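Alternatively (not part of the original steps), if the v4l-utils package is installed, you can inspect the camera before running GStreamer:

# list all video devices and the driver behind them
v4l2-ctl --list-devices
# show the pixel formats the camera exposes
v4l2-ctl -d /dev/video0 --list-formats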
View full article
Qt Creator can be a very good IDE for developing great Qt applications. This IDE not only helps with syntax highlighting and access to examples and tutorials, it also helps you configure different toolchains, Qt binary versions and target options.

First download the binary installer:
For 32 bits:
$ wget http://releases.qt-project.org/qtcreator/2.6.2/qt-creator-linux-x86-opensource-2.6.2.bin
For 64 bits:
$ wget http://releases.qt-project.org/qtcreator/2.6.2/qt-creator-linux-x86_64-opensource-2.6.2.bin

Execute the binary:
$ ./qt-creator-linux-x86_64-opensource-2.6.2.bin

Follow the installer GUI and choose a location; the default options should be OK. In my case the installation was done here:
/home/b35153/qtcreator-2.6.2/bin

Open Qt Creator (in my case from the command line, using "&" to regain control of the terminal):
$ ./qtcreator &

Open Tools -> Options, choose Build & Run on the menu on the left, and select the Compilers tab. Here you can add the toolchain GCC compiler of your choice; it will appear in the "Manual" section.

Now click on the Qt Versions tab. Here you can add the qmake that was created by your Qt installation, for example the Qt5 installation described here: Building QT for i.MX6. It will appear in the Manual section. In my case I have a qmake for the PC and a qmake for i.MX6.

Now click on the Kits tab. Here you can create combinations of compilers and qmake, and also specify where you want the executables to go. In my case I combined the i.MX6 toolchain and the qmake for i.MX6 I had created. I did not set up the device configuration since the sysroot is already shared with my device via NFS, but you can configure it so the files are sent to your device via ssh.

And that's it! Next time you load a project you can choose which kit you want to work with, and it will be compiled just as you need.
View full article
GStreamer has a simple feature to enable tracing, allowing the developer to do basic debugging. This can be done in two ways:
Adding the parameter --gst-debug=LIST to the pipeline (a pipeline is an executed gst-launch command)
Prepending the environment variable GST_DEBUG=LIST

LIST is a comma-separated argument indicating the GStreamer elements to trace. For example, if one needs to trace the sink element:
$ GST_DEBUG=*sink*:5 gst-launch playbin2 uri=file:///sample.avi
or
$ gst-launch playbin2 uri=file:///sample.avi --gst-debug=*sink*:5
Both commands produce the same log. In case you want to trace more than one element, simply add more <element>:5 entries, for example:
$ GST_DEBUG=mfw_v4lsink:5,vpudec:5 gst-launch playbin2 uri=file:///sample.avi

The number 5 indicates the log level, where 5 is the highest (the most verbose log you can get) and 0 produces no output (5=LOG, 4=DEBUG, 3=INFO, 2=WARN, 1=ERROR).

The log can be huge for each pipeline run. One way to filter it is using the grep command. Before grepping, one needs to redirect the standard error to the standard output (the GStreamer log always goes to stderr), so:
$ GST_DEBUG=mfw_v4lsink:5,vpudec:5 gst-launch playbin2 uri=file:///sample.avi 2>&1 | grep <filter string>

In case the log needs to be shared, it is important to remove the 'color' of the log; again, one just needs to add the parameter --gst-debug-no-color or prepend the env variable GST_DEBUG_NO_COLOR=1
-----
More shell variables that GStreamer reacts to can be found here: https://developer.gnome.org/gstreamer/0.10/gst-running.html
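Putting these options together (a usage example based on the commands above), a colorless, shareable log file can be captured like this:

$ GST_DEBUG=*sink*:5 GST_DEBUG_NO_COLOR=1 gst-launch playbin2 uri=file:///sample.avi 2> gst_sink.log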
View full article
Overview:
This document is written for Freescale customers who have the Freescale AC3 release packages (excluded package). (If you do not have the AC3 release package, you can disregard this document.)
The Freescale OMX Player in the Android release supports audio track selection when playing files with multiple audio tracks. However, most customers don't use this enhanced API to select the audio track, even when the current audio codec is not supported. To avoid silent output when only some of the audio tracks can be played, this document provides a method to select an available audio track automatically. The patch in this document is not included in our current release because it does not match our track selection rule - play the first track. If you have any ideas about this issue, feel free to add comments to this document.

Issue description:
Software: R13.4-GA or R13.4.1 Android releases
Hardware: MX6Dual/Quad SabreSD board
Test source: 1.mkv
Test steps:
1. Launch Gallery from the main menu.
2. Play the video.
You can watch the video, but without any sound.

Root cause:
The file has 2 audio tracks, DTS & AC3: audio track 1 is DTS and track 2 is AC3. The OMX Player chooses the first audio track as the default track to play, which is DTS. However, the software only supports the AC3 audio codec, so it cannot set up an audio decoder for the DTS track. If the AC3 track is chosen instead, sound can be heard.

How to fix:
The audio track index is set in GMPlayer::LoadParser(). You can get the audio format to check whether it is supported by the decoder. Please see the patch audio_track_slection.diff
View full article