i.MX Processors Knowledge Base

Multiple-Overlay (or Multi-Overlay) means several video playbacks on a single screen. In case multiple screens are needed, check the dual-display examples in GStreamer i.MX6 Multi-Display.

$ export VSALPHA=1
$ SAMPLE1=sample1.avi; SAMPLE2=sample2.avi; SAMPLE3=sample3.avi; SAMPLE4=sample4.avi
$ WIDTH=320; HEIGHT=240; SEP=20

Four overlays (2x2)

$ gst-launch \
playbin2 uri=file://`pwd`/$SAMPLE1 video-sink="mfw_isink axis-top=0 axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT" \
playbin2 uri=file://`pwd`/$SAMPLE2 video-sink="mfw_isink axis-top=0 axis-left=`expr $WIDTH + $SEP` disp-width=$WIDTH disp-height=$HEIGHT" \
playbin2 uri=file://`pwd`/$SAMPLE3 video-sink="mfw_isink axis-top=`expr $HEIGHT + $SEP` axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT" \
playbin2 uri=file://`pwd`/$SAMPLE4 video-sink="mfw_isink axis-top=`expr $HEIGHT + $SEP` axis-left=`expr $WIDTH + $SEP` disp-width=$WIDTH disp-height=$HEIGHT"

Basic rotation (2x1, normal and inverted)

$ gst-launch \
playbin2 uri=file://`pwd`/$SAMPLE1 video-sink="mfw_isink axis-top=0 axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT rotation=0" \
playbin2 uri=file://`pwd`/$SAMPLE2 video-sink="mfw_isink axis-top=`expr $HEIGHT + $SEP` axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT rotation=3"
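To see the full set of mfw_isink properties used above (axis-top, axis-left, disp-width, disp-height, the valid rotation values, etc.), gst-inspect can be used, as elsewhere in these notes:

$ gst-inspect mfw_isink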
View full article
In some cases it is desired to directly have progressive content available from a TV-IN interface through the V4L2 capture device. In the BSP, HW-accelerated de-interlacing is only supported in the V4L2 output stream. Below is a patch created against a rather old BSP version that adds support for de-interlaced V4L2 capture. The patch might need to be adapted to newer BSPs; however, the logic and functionality is there and should shorten the development time. This patch adds another input device to the V4L2 framework that can be selected to perform the de-interlacing on the way to memory. The selection is done by passing the index "2" as an argument to the VIDIOC_S_INPUT V4L2 ioctl. Also attached is a modified tvin unit test that gives an example of how to use the new driver. An example sequence for running the test is as follows:

modprobe mxc_v4l2_capture
./mxc_v4l2_tvin_vdi.out -ow 720 -oh 480 -ol 10 -ot 20 -f YU12

Some key things to note:
This driver does not support resize or color space conversion on the way to memory. The requested format and size should match what can be provided directly by the sensor.
The driver was tested on a Sabre AI Rev A board running Linux 12.02.
This code is not an official delivery and as such no guarantee of support for this code is provided by Freescale.
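For a quick check outside the unit test, the inputs exposed by the patched driver can also be inspected and selected with v4l2-ctl (a sketch, assuming the v4l-utils package is installed and the capture device registered as /dev/video0):

# List the inputs exposed by the capture driver; the de-interlacing input added by the patch should appear at index 2
v4l2-ctl -d /dev/video0 --list-inputs
# Select input index 2 (equivalent to calling the VIDIOC_S_INPUT ioctl with index 2)
v4l2-ctl -d /dev/video0 --set-input=2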
View full article
This is a prototype solution for enabling a second display showing different content on a JB4.2.2 SabreSD. It makes use of the Presentation class provided by Android, embedded into the status bar. When the screen is unlocked, the Presentation shows on the second display. As it stands, the solution requires one .mp4 video placed in the root of the SD card; of course, you may change it to show anything. The attached files are a layout XML file, a patch, and a recorded video. The layout file should be put into the android/frameworks/base/packages/SystemUI/res/layout/ folder. The patch should be applied to frameworks/base.git. The recorded video shows the dual-display demo as a reference.
View full article
Qt Creator can be a very good IDE for developing great Qt applications. This IDE not only helps with syntax highlighting and access to examples and tutorials, but also helps you to configure different toolchains, Qt binary versions and target options. First download the binary installer:

For 32 bits:
$ wget http://releases.qt-project.org/qtcreator/2.6.2/qt-creator-linux-x86-opensource-2.6.2.bin
For 64 bits:
$ wget http://releases.qt-project.org/qtcreator/2.6.2/qt-creator-linux-x86_64-opensource-2.6.2.bin

Execute the binary:
$ ./qt-creator-linux-x86_64-opensource-2.6.2.bin

Follow the installer GUI and choose a location. Default options should be OK. In my case the installation was done here: /home/b35153/qtcreator-2.6.2/bin

Open Qt Creator (in my case from the command line; use "&" to regain control of the terminal):
$ ./qtcreator &

Open Tools -> Options. Choose Build & Run on the menu on the left, and select the Compilers tab. Here you can add the toolchain GCC compiler of your convenience; it will appear in the "Manual" section. Now click on the Qt Versions tab. Here you can add the qmake that you created with your Qt installation, for example the Qt5 installation described here: Building QT for i.MX6. It will appear in the Manual section. In my case I have a qmake for the PC and a qmake for the i.MX6. Now click on the Kits tab. Here you can create combinations of compilers and qmake, and also specify where you want the executables to go. In my case I combined the i.MX6 toolchain and the qmake for i.MX6 I had created. I did not set up device configuration since the sysroot is already shared to my device via NFS, but you can configure it so the files are sent via ssh to your device. And that's it! Next time you load a project you can choose which Kit you want to work on, and it will be compiled just as you need.
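Before adding a qmake binary in the Qt Versions tab, it can be handy to confirm which Qt build it belongs to (a quick sanity check; the path below is just an example location for a cross-built Qt5):

$ /opt/qt5-imx6/bin/qmake -v
# Prints the qmake version plus the Qt library version and install prefix it was built against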
View full article
Multiple-Display means video playback on multiple screens. In case playback needs to be on a single screen, the mfw_isink element must be used; some pipeline examples can be found at this link: GStreamer i.MX6 Multi-Overlay.

# Set these shell variables before running the pipelines
alias gl=gst-launch
SINK_1="\"mfw_v4lsink device=/dev/video17\""
SINK_2="\"mfw_v4lsink device=/dev/video18\""
SINK_3="\"mfw_v4lsink device=/dev/video20\""
media1=file:///root/media1
media2=file:///root/media2
media3=file:///root/media3

2 displays, hdmi + lvds
Kernel parameters: video=mxcfb0:dev=hdmi,1920x1080M@60,if=RGB24 video=mxcfb1:dev=ldb,LDB-XGA,if=RGB666
Pipeline: gl playbin2 uri=$media1 video-sink=$SINK_1 playbin2 uri=$media2 video-sink=$SINK_2

2 displays, lvds + lvds
Kernel parameters: video=mxcfb0:dev=ldb,LDB-XGA,if=RGB666 video=mxcfb1:dev=ldb,LDB-XGA,if=RGB666
Pipeline: gl playbin2 uri=$media1 video-sink=$SINK_1 playbin2 uri=$media2 video-sink=$SINK_2

2 displays, lcd + lvds
Kernel parameters: video=mxcfb0:dev=lcd,800x480M@55,if=RGB565 video=mxcfb1:dev=ldb,LDB-XGA,if=RGB666
Pipeline: gl playbin2 uri=$media1 video-sink=$SINK_1 playbin2 uri=$media2 video-sink=$SINK_2

3 displays, hdmi + lvds + lvds
Kernel parameters: video=mxcfb0:dev=hdmi,1920x1080M@60,if=RGB24 video=mxcfb1:dev=ldb,LDB-XGA,if=RGB666 video=mxcfb2:dev=ldb,LDB-XGA,if=RGB666
Pipeline: gl playbin2 uri=$media1 video-sink=$SINK_1 playbin2 uri=$media2 video-sink=$SINK_2 playbin2 uri=$media3 video-sink=$SINK_3
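Before running the pipelines, it can help to confirm that the kernel parameters actually registered one framebuffer per display (a quick check; the framebuffer names listed depend on the BSP):

cat /proc/fb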
View full article
Watch the Freescale i.MX team boot up Android 5.0 Lollipop on i.MX6 application processors. The Freescale i.MX Android team has booted Android 5.0 Lollipop on the SABRE platform for the i.MX6 series. Google pushed all of the latest source for its Android release to AOSP on Nov. 5, and the Freescale Android team started their work immediately. After six days of bring-up, the team had the basic features enabled on day 7: connectivity, audio/video playback, sensors, inputs and display! You can see some of the changes in the demo video at the beginning of the post. The Freescale i.MX Android team has closely followed almost every version of Android since its release to AOSP and has good experience with it. Below are some snapshots and pictures of Android Lollipop.
View full article
1. Set up HDMI
Set up your kernel to use HDMI by adding the following code to the bootargs in u-boot:
video=mxcfb0:dev=hdmi,1920x1080M@60,if=RGB24

2. Test raw audio
In order to test only raw audio, use the following command:
aplay -D hw:1,0 Kaleidoscope.wav

3. Make HDMI audio the default output
In order to configure audio output over HDMI, replace the content of the file ~/.asoundrc with the following:

pcm.dmix_48000{
     type dmix
     ipc_key 5678293
     ipc_key_add_uid yes
     slave{
          pcm "hw:1,0"
          period_time 0
          period_size 2048
          buffer_size 24576
          format S16_LE
          rate 48000
     }
}
pcm.!dsnoop_44100{
     type dsnoop
     ipc_key 5778293
     ipc_key_add_uid yes
     slave{
          pcm "hw:0,0"
          period_time 0
          period_size 2048
          buffer_size 24576
          format S16_LE
          rate 44100
     }
}
pcm.!dsnoop_48000{
     type dsnoop
     ipc_key 5778293
     ipc_key_add_uid yes
     slave{
          pcm "hw:1,0"
          period_time 0
          period_size 2048
          buffer_size 24576
          format S16_LE
          rate 48000
     }
}
pcm.asymed{
     type asym
     playback.pcm "dmix_48000"
     capture.pcm "dsnoop_44100"
}
pcm.dsp0{
     type plug
     slave.pcm "asymed"
}
pcm.!default{
     type plug
     route_policy "average"
     slave.pcm "asymed"
}
ctl.mixer0{
     type hw
     card 0
}

This will configure alsa to use sound card hw:1,0. Pay attention to use the proper audio card name for your device. In order to see the available sound cards on the board:

root@imx53qsb:~# aplay -l
**** List of PLAYBACK Hardware Devices ****
card 0: imx3stack [imx-3stack], device 0: SGTL5000 SGTL5000-0 []
  Subdevices: 1/1
  Subdevice #0: subdevice #0
card 1: imx3stackspdif [imx-3stack-spdif], device 0: IMX SPDIF mxc spdif-0 []
  Subdevices: 1/1
  Subdevice #0: subdevice #0

For details on how to create asound.conf, please see the alsa-lib configuration introduction.

4. Encoded audio
For encoded (i.e. AC3, DTS) audio you can use, for example, ac3dec, a utility provided by alsa-tools, with the following command line:
ac3dec -D hw:1,0 -C test.ac3
This works for both HDMI audio and SPDIF audio. Double-check your hardware and/or schematic in order to know which one to use.
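Once ~/.asoundrc is in place, playback without an explicit -D device should already route to HDMI. A quick way to verify (assuming the standard ALSA utilities are installed):

# Play through the default PCM defined above; audio should come out over HDMI
aplay Kaleidoscope.wav
# Or generate test tones on the default device
speaker-test -c 2 -t wav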
View full article
In this article, some experiments are done to verify the capability of the i.MX6DQ for video playback under different VPU clocks.

1. Preparation
Board: i.MX6DQ SD
Bitstream: 1080p sunflower with 40Mbps, considered the toughest H264 clip. The original clip is copied 20 times to generate a new raw video (repeating the sunflower clip 20 times) and then encapsulated into an mp4 container. This is to remove and minimize the influence of the startup workload of gstreamer compared to the vpu unit test.
Kernels: kernels generated with different VPU clock settings: 270MHz, 298MHz, 329MHz, 352MHz, 382MHz.
Test setting: 1080p content decoded and displayed on a 1080p device (no resize).

2. Test commands for the VPU unit test and GStreamer
Tiled-format video playback is faster than NV12 format, so in the experiments below we choose the tiled format during playback.

Unit test command (we set the frame rate -a 70, higher than the 1080p 60fps HDMI refresh rate):
/unit_tests/mxc_vpu_test.out -D "-i /media/65a78bbd-1608-4d49-bca8-4e009cafac5e/sunflower_2B_2ref_WP_40Mbps.264 -f 2 -y 1 -a 70"

Gstreamer command (free run to get the highest playback speed):
gst-launch filesrc location=/media/65a78bbd-1608-4d49-bca8-4e009cafac5e/sunflower_2B_2ref_WP_40Mbps.mp4 typefind=true ! aiurdemux ! vpudec framedrop=false ! queue max-size-buffers=3 ! mfw_v4lsink sync=false

3. Video playback framerate measurement
During the test, we run "echo performance > /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor" to make sure the CPU always works at the highest frequency, so that it can respond to any interrupt quickly. For each test point with a different VPU clock, we do 5 rounds of tests. The max and min playback values are removed, and the remaining 3 values are averaged to get the final playback framerate (e.g., for the 270MHz unit test, 55.4 and one 57.3 are dropped, and the average of 57.04, 57.3 and 56.15 gives 56.83).

Dec means the VPU decoding fps; Playback means the overall playback fps. Min/Max/Avg are computed over the playback values.

270MHz unit test: Dec 57.8 / 57.81 / 57.78 / 57.87 / 57.91; Playback 57.3 / 57.04 / 57.3 / 56.15 / 55.4; Min 55.4, Max 57.3, Avg 56.83
270MHz GST: Playback 53.76 / 54.163 / 54.136 / 54.273 / 53.659; Min 53.659, Max 54.273, Avg 54.01967
298MHz unit test: Dec 60.97 / 60.98 / 60.97 / 60.94 / 60.98; Playback 58.37 / 58.55 / 57.8 / 58.07 / 58.65; Min 57.8, Max 58.65, Avg 58.33
298MHz GST: Playback 56.755 / 49.144 / 53.271 / 56.159 / 56.665; Min 49.144, Max 56.755, Avg 55.365
329MHz unit test: Dec 63.8 / 63.92 / 63.8 / 63.82 / 63.78; Playback 59.52 / 52.63 / 58.1 / 58.26 / 59.34; Min 52.63, Max 59.52, Avg 58.56667
329MHz GST: Playback 57.815 / 55.857 / 56.862 / 58.637 / 56.703; Min 55.857, Max 58.637, Avg 57.12667
352MHz unit test: Dec 65.79 / 65.78 / 65.78 / 66.16 / 65.93; Playback 59.63 / 59.68 / 59.65 / 49.21 / 57.67; Min 49.21, Max 59.68, Avg 58.98333
352MHz GST: Playback 58.668 / 59.103 / 56.419 / 58.08 / 58.312; Min 56.419, Max 59.103, Avg 58.35333
382MHz unit test: Dec 64.34 / 67.8 / 67.75 / 67.81 / 67.77; Playback 56.58 / 58.73 / 59.68 / 59.36 / 59.76; Min 56.58, Max 59.76, Avg 59.25667
382MHz GST: Playback 59.753 / 58.893 / 58.972 / 58.273 / 59.238; Min 58.273, Max 59.753, Avg 59.03433

Some explanation: why does the GStreamer performance still improve while the unit test is flatter? On GStreamer there is a VPU wrapper which makes the VPU API more intuitive to call. So at first, the overall GST playback performance is constrained by the VPU (VPU decode 57.8 fps). And finally, as the VPU decoding performance goes above 60fps when the VPU clock increases, the constraint becomes the display refresh rate of 60fps. The video display overhead of GStreamer is only about 1 fps, similar to the unit test.

Based on the test results, we can see that at 352MHz, overall 1080p video playback on a 1080p display can reach ~60fps. Alternatively, by time-sharing two pipelines across two displays, we can do 2 x 1080p @ 30fps video playback.
Note, however, that this experiment is only valid for 1080p video playback on a 1080p display. For interlaced clips, or displays whose size differs from 1080p, the overall playback performance is limited by post-processing such as de-interlacing and resizing.
View full article
Test digital zoom with the IPU for camera preview.

Board: Sabre-SD (i.MX6DQ)
BSP: Android 13.4 GA

In the above flow, one frame buffer is processed in four steps during camera preview. We add a step that changes the frame buffer before step 4; this added step zooms one preview frame. The figure below shows the crop function of the IPU library; we use this function to scale the frame.

Test result: preview at zoom level 0 and at the maximum zoom level (see the attached screenshots).

When taking pictures at 5M pixels with the zoom over level 1, the picture size is not 2592x1944 but 2016x1512. The underlying reason is that the IPU crop function only supports a maximum output of 2048x2048.

Thumbnails of the test results are attached.
View full article
Note: All these GStreamer pipelines have been tested using an i.MX6Q board with kernel version 3.0.35-2026-geaaf30e.

Tools: gst-launch, gst-inspect

FSL Pipeline Examples: GStreamer i.MX6 Decoding, GStreamer i.MX6 Encoding, GStreamer Transcoding and Scaling, GStreamer i.MX6 Multi-Display, GStreamer i.MX6 Multi-Overlay, GStreamer i.MX6 Camera Streaming, GStreamer RTP Streaming

Other plugins: GStreamer ffmpeg, GStreamer i.MX6 Image Capture, GStreamer i.MX6 Image Display

Misc: Testing GStreamer, Tracing GStreamer Pipelines, GStreamer miscellaneous
View full article
Overview
As more and more communication is required between online and offline, QR codes are widely used in mobile payment, mobile mini-apps, industrial identification of things, etc. The i.MX6UL/ULL has CSI and PXP IPs for camera connection and image CSC/FLIP/ROTATION acceleration. An LCDIF IP supports the display, but there is no 3D IP. This means this low-power, low-end AP is very suitable for the industrial HMI segment, which does not require a flashy 3D graphic display, but a simple and straightforward GUI for interaction. A QR code scanner is one of the use cases in the industrial segment that more and more customers are focusing on. The CPU frequency of the i.MX6UL is about 500MHz and it has no GPU IP, so a lightweight GUI and window system is required. Here we recommend Qt with the Wayland backend (without X11), which makes the window system smaller and faster than a traditional X11 UI. Qt was chosen because it has an open-source version, rich components, platform independence, good performance on embedded systems and strong development tools like QtCreator for creating applications. To enable the Qt development environment, check this: Enable QT developement for i.MX6UL (v2). Here I made a QR code scanner demo based on QT5.6 + QZXing (QR/bar code scan engine) running on the i.MX6UL EVK board with a UVC camera (at least 640x480 resolution is required) and a 480x272px LCD. The source code is open here (License Apache2.0): https://github.com/muddog/QRScanner

Implementation
For camera preview and capture, you should think of GStreamer first: it is easy to use and has the acceleration pads implemented by NXP for the i.MX6UL. It's very easy to enable a preview in the console:

$ gst-launch-1.0 v4l2src device=/dev/video1 ! video/x-raw,format=YUY2,width=640,height=320 ! imxvideoconvert_pxp ! video/x-raw,format=RGB16 ! waylandsink

This works on the i.MX6UL EVK, with the PXP IP doing the color space conversion from YUY2 -> RGB16 and, potentially, scaling of the image. The CPU loading of this is about 20-30%; if you replace "imxvideoconvert_pxp" with the "videoconvert" component, CSC and scaling are done by the CPU and the loading increases to 50-60%. "/dev/video1" is the device node for the UVC camera; it may differ in your environment. So our target is clear: create such a pipeline (with PXP acceleration) in the Qt application and use an appsink to get preview images (see the pipeline sketch at the end of this article), doing a simple "sink" to one QWidget by drawing each image on the widget surface for preview (say every 50ms for 20fps). Then, in another thread, we fetch the preview buffer at a fixed frequency (like every 0.5s) and feed it into the ZXing engine to decode the strings inside the image. Here are the classes created inside the source code:

ScannerQWidgetSink
Acts as a GStreamer sink for preview rendering. It initializes the pipeline and creates a timer with a 50ms timeout. In the timer handler, we use the appsink to copy the camera buffer from GStreamer and tell the ViewfinderWidget to update (re-draw event).

ViewfinderWidget
This class inherits from QWidget and draws the preview buffer as a QImage onto its own surface using QPainter. The QImage is created at the very beginning with the image buffer allocated by the ScannerQWidgetSink. Because QImage itself does not maintain the image buffer, the buffer must stay alive as long as the QImage is in use. So we keep this buffer for the whole ScannerQWidgetSink life cycle and copy the appsink buffer from the pipeline into it for preview.
MainWindow
Creates the main window, which has no title bar or border, and starts the animation for the red scan-line bar. It creates instances of DecoderThread and ScannerQWidgetSink, then sets them up and starts them.

DecoderThread
An infinite loop that waits for a buffer released by the ScannerQWidgetSink every 0.5s. It copies the buffer data into its own buffer (imgData) to avoid any change to the buffer by the sink while decoding, then feeds this copy into the ZXing engine to get the decoding result, which is shown on the QLabel.

Screenshot under the wayland (weston) desktop:

Customize
Camera instance
Currently I use a UVC camera plugged into the USB host, whose device node is /dev/video1. If you want to use CSI or another device, change the construction parameters for ScannerQWidgetSink():

sink = new ScannerQWidgetSink(ui->widget, QString("v4l2src device=/dev/video1"));

Image resolution captured and previewed
Change the static member values of the ScannerQWidgetSink class:

uint ScannerQWidgetSink::CAPTURE_HEIGHT = 480;
uint ScannerQWidgetSink::CAPTURE_WIDTH = 640;

Preview fps and decoding frequency
Find the "framerate=20/1" string in ScannerQWidgetSink::GstPipelineInit() and change it to your fps. You also have to change the renderTimer start timeout value in ::StartRender(). The decoding frequency is determined by renderCnt, which sets after how many preview frames the decoder is fed.

Main window size
The main window size is fixed; you have to change mainwindow.ui. This is easy to do in the QtCreator Designer.

FAQ
Why not use a CSI camera in the demo?
Honestly, I do not have a CSI camera module; it is also DNP when you buy the board on NXP.com. So a widely available UVC camera is preferred; it also makes it easy for you to scan a QR code on your phone, your display panel, etc.

Why not use QCamera for preview and capture?
The QCamera class in the Qt Multimedia component uses the camerabin2 GStreamer plugin, which creates a very long pipeline for the different usages of viewfinder, image capture and video encoding. Camerabin2 eats too much CPU and memory resource; taking pictures and recording are very, very slow. The preview at 30fps eats about 70-80% CPU loading even after I hacked it to use imxvideoconvert_pxp instead of the software videoconvert. Finally I gave up implementing the QRScanner based on QCamera.

How to make sure only one instance of the Qt app is running?
We can use QSharedMemory to create a shared memory block with a unique KEY. When a second instance of the app is started, it checks whether the shared memory with this KEY has already been created. If the shm is there, another instance is already running, and it has to exit().
But as the Qt documentation mentions, the QSharedMemory segment cannot be destroyed correctly when the app crashes; this means we have to handle each termination signal and do the delete ourselves:

static QSharedMemory *gShm = NULL;

static void terminate(int signum)
{
   if (gShm) {
      delete gShm;
      gShm = NULL;
   }
   qDebug() << "Terminate with signal:" << signum;
   exit(128 + signum);
}

int main(int argc, char *argv[])
{
   QApplication a(argc, argv);

   // Handle any further termination signals to ensure the
   // QSharedMemory block is deleted even if the process crashes
   signal(SIGHUP, terminate );  // 1
   signal(SIGINT, terminate );  // 2
   signal(SIGQUIT, terminate ); // 3
   signal(SIGILL, terminate );  // 4
   signal(SIGABRT, terminate ); // 6
   signal(SIGFPE, terminate );  // 8
   signal(SIGBUS, terminate );  // 10
   signal(SIGSEGV, terminate ); // 11
   signal(SIGSYS, terminate );  // 12
   signal(SIGPIPE, terminate ); // 13
   signal(SIGALRM, terminate ); // 14
   signal(SIGTERM, terminate ); // 15
   signal(SIGXCPU, terminate ); // 24
   signal(SIGXFSZ, terminate ); // 25

   gShm = new QSharedMemory("QRScannerNXP");
   if (!gShm->create(4, QSharedMemory::ReadWrite)) {
      delete gShm;
      qDebug() << "Only allow one instance of QRScanner";
      exit(0);
   }
   .....
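As referenced above, the preview pipeline inside ScannerQWidgetSink follows the same pattern as the console pipeline, with the display sink replaced by an appsink so the application can pull the frames itself. A minimal sketch of what that pipeline string would look like (element names follow the article's console example; the exact caps and appsink properties are assumptions):

$ gst-launch-1.0 v4l2src device=/dev/video1 ! video/x-raw,format=YUY2,width=640,height=480 ! imxvideoconvert_pxp ! video/x-raw,format=RGB16 ! appsink name=preview-sink max-buffers=2 drop=true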
View full article
There is no Freescale GStreamer element which does JPEG decoding, so we must rely on a standard one, like 'jpegdec'. In case your Linux system was built using LTIB, in order to have the jpegdec element included in gst-plugins-good, follow these steps:

On the LTIB menuconfig, make sure the following packages are selected:
gstreamer-plugins-good
libjpeg
libpng

Remove the configure parameters '--disable-libpng' and '--disable-jpeg' in the file './dist/lfs-5.1/gst-plugins-good/gst-plugins-good.spec'. Rebuild and flash your board (or SD card) again.

Image display:
VSALPHA=1 gst-launch filesrc location=sample.jpeg ! jpegdec ! imagefreeze ! mfw_isink

Important: a non-8-pixel-aligned width and height is treated as an unsupported format by the isink plugin.
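Since libpng is enabled by the same steps, a PNG file should display in much the same way (a sketch, assuming the stock pngdec element from gst-plugins-good; ffmpegcolorspace is inserted because pngdec outputs RGB, which the sink may not accept directly):

VSALPHA=1 gst-launch filesrc location=sample.png ! pngdec ! ffmpegcolorspace ! imagefreeze ! mfw_isink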
View full article
Notes: First run the playback pipeline, then the streaming pipeline. The example below streams H263 video and AMR audio data; change the codec format to your needs. In the case where the i.MX is the streaming machine, the audio encoder 'amrnbenc' must be installed first. This scenario has not been tested.

Shell variables and pipelines

Playback machine (receiver)
# On the playback machine, set either the IMX2PC or PC2IMX variables, then run the pipeline

## IMX2PC: case where the PC does the playback
    AUDIO_DEC_SINK="rtpamrdepay ! amrnbdec ! alsasink "
    VIDEO_CAPS="\"application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H263-1998\""
    VIDEO_DEC_SINK="rtph263pdepay ! ffdec_h263 ! autovideosink"
## End of IMX2PC settings

## PC2IMX: case where the iMX does the playback
    AUDIO_DEC_SINK="rtpamrdepay ! mfw_amrdecoder ! alsasink "
    VIDEO_CAPS="\"application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H263-1998\""
    VIDEO_DEC_SINK="rtph263pdepay ! vpudec ! mfw_v4lsink "
## End of PC2IMX settings

PLAYBACK_AUDIO="udpsrc caps=\"application/x-rtp,media=(string)audio,clock-rate=(int)8000,encoding-name=(string)AMR,encoding-params=(string)1,octet-align=(string)1\" \
            port=5002 ! rtpbin.recv_rtp_sink_1 \
        rtpbin. ! $AUDIO_DEC_SINK \
     udpsrc port=5003 ! rtpbin.recv_rtcp_sink_1 \
     rtpbin.send_rtcp_src_1 ! udpsink port=5007 sync=false async=false"

PLAYBACK_VIDEO="udpsrc caps=$VIDEO_CAPS port=5000 ! rtpbin.recv_rtp_sink_0 \
        rtpbin. ! $VIDEO_DEC_SINK \
        udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0 \
        rtpbin.send_rtcp_src_0 ! udpsink port=5005 sync=false async=false"

PLAYBACK_AV="$PLAYBACK_VIDEO $PLAYBACK_AUDIO"

# Playback pipeline
gst-launch -v gstrtpbin name=rtpbin $PLAYBACK_AV

Streaming machine (sender)
# On the streaming machine, set either the IMX2PC or PC2IMX variables, then run the pipeline

## IMX2PC: case where the iMX does the streaming
    IP=x.x.x.x # IP address of the playback machine
    VIDEO_SRC="mfw_v4lsrc"
    VIDEO_ENC="vpuenc codec=h263 ! rtph263ppay "
    AUDIO_ENC="audiotestsrc ! amrnbenc ! rtpamrpay "
## End of IMX2PC settings

## PC2IMX: case where the PC does the streaming
    IP=y.y.y.y # IP address of the playback machine
    VIDEO_SRC="v4l2src"
    VIDEO_ENC="ffenc_h263 ! rtph263ppay "
    AUDIO_ENC="audiotestsrc ! amrnbenc ! rtpamrpay "
## End of PC2IMX settings

STREAM_AUDIO="$AUDIO_ENC ! rtpbin.send_rtp_sink_1 \
        rtpbin.send_rtp_src_1 ! udpsink host=$IP port=5002 \
        rtpbin.send_rtcp_src_1 ! udpsink host=$IP port=5003 sync=false async=false \
        udpsrc port=5007 ! rtpbin.recv_rtcp_sink_1"

STREAM_VIDEO="$VIDEO_SRC ! $VIDEO_ENC ! rtpbin.send_rtp_sink_0 \
        rtpbin.send_rtp_src_0 ! queue ! udpsink host=$IP port=5000 \
        rtpbin.send_rtcp_src_0 ! udpsink host=$IP port=5001 sync=false async=false \
        udpsrc port=5005 ! rtpbin.recv_rtcp_sink_0"

STREAM_AV="$STREAM_VIDEO $STREAM_AUDIO"

# Stream pipeline
gst-launch -v gstrtpbin name=rtpbin $STREAM_AV
View full article
This document describes all the i.MX 8 MIPI-CSI use cases, showing the available cameras and daughter cards supported by the boards, the compatible Device Tree (DTS) files, and how to enable these different camera options on the i.MX 8 boards. It also describes some advanced camera use cases, such as multiple-camera output using the imxcompositor_g2d plugin, GStreamer zero-copy pipelines, and V4L2 API extra-controls examples.
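For reference, a two-camera composition with imxcompositor_g2d typically follows the pattern sketched below (a minimal example, not taken from the document; the device nodes, resolutions and display sink are assumptions that depend on the board and BSP):

gst-launch-1.0 imxcompositor_g2d name=comp \
    sink_0::xpos=0 sink_0::ypos=0 sink_0::width=960 sink_0::height=540 \
    sink_1::xpos=960 sink_1::ypos=0 sink_1::width=960 sink_1::height=540 \
    ! waylandsink \
    v4l2src device=/dev/video0 ! comp.sink_0 \
    v4l2src device=/dev/video1 ! comp.sink_1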
View full article
BlueZ5 provides support for the core Bluetooth layers and protocols. It is flexible, efficient and uses a modular implementation. BlueZ5 implements the Bluetooth low-level host stack for Bluetooth core specifications 4.0 and 3.0+HS, which includes GAP, L2CAP, RFCOMM and SDP. Besides the host stack, BlueZ5 also supports the following profiles, either itself or via third-party software:

Profiles provided by BlueZ: A2DP 1.3, AVRCP 1.5, DI 1.3, HDP 1.0, HID 1.0, PAN 1.0, SPP 1.1
GATT (LE) profiles: PXP 1.0, HTP 1.0, HoG 1.0, TIP 1.0, CSCP 1.0
OBEX-based profiles (by obexd): FTP 1.1, OPP 1.1, PBAP 1.1, MAP 1.0
Provided by the oFono project: HFP 1.6 (AG & HF)

BlueZ5 is supported in the latest Freescale Linux BSP release, so it is pretty easy to generate the binaries for the Bluetooth core stack and its profiles. In order to support an A2DP sink on a SabreSD board, the following software should be downloaded and installed onto the target rootfs too:

sbc decoder version 1.3 (http://www.kernel.org/pub/linux/bluetooth/sbc-1.3.tar.gz)
PulseAudio 5.0 (http://www.freedesktop.org/software/pulseaudio/releases/pulseaudio-5.0.tar.xz)

The PulseAudio package has some dependencies on the bluetooth and sbc packages; pulseaudio detects whether those two packages have been built and then decides which pulse plugin modules to generate. So the build order is: 1) bluez5_utils or bluez_utils, 2) sbc, 3) pulseaudio. After compiling and installing the above software onto the target rootfs, you should be able to see the following executables under the directory /usr/bin:

From BlueZ5: bluetoothctl, hciconfig, hciattach (needed for operating a UART Bluetooth module)
From PulseAudio: pulseaudio, pactl, paplay

If the build dependencies have been set up correctly, the following pulse plugin modules should be located under the directory /usr/lib/pulse-5.0/modules:

module-bluetooth-discover.so
module-bluetooth-policy.so
module-bluez5-device.so
module-bluez5-discover.so

Edit the file /etc/dbus-1/system.d/pulseaudio-system.conf so the policy for the pulse user contains the following lines (the org.bluez and ObjectManager lines are the additions):

<policy user="pulse">
    <allow own="org.pulseaudio.Server"/>
    <allow send_destination="org.bluez"/>
    <allow send_interface="org.freedesktop.DBus.ObjectManager"/>
</policy>

Edit the file /etc/dbus-1/system.d/bluetooth.conf, and add the following lines:

<policy user="pulse">
     <allow send_destination="org.bluez"/>
     <allow send_interface="org.freedesktop.DBus.ObjectManager"/>
</policy>

Add the following settings at the bottom of the pulseaudio system configuration file, located at /etc/pulse/system.pa:

### Automatically load driver modules for Bluetooth hardware
.ifexists module-bluetooth-policy.so
load-module module-bluetooth-policy
.endif
.ifexists module-bluetooth-discover.so
load-module module-bluetooth-discover
.endif
load-module module-switch-on-connect
load-module module-alsa-sink device_id=0 tsched=true tsched_buffer_size=1048576 tsched_buffer_watermark=262144

On a system that can automatically detect the ALSA cards, the module-alsa-sink line above should be removed. Also make sure "auth-anonymous=1" is added to the following line, which resolves the issue "Denied access to client with invalid authorization data":

load-module module-native-protocol-unix auth-anonymous=1

Select an audio re-sampling algorithm and configure the audio output by adding the following settings to the file daemon.conf, located in /etc/pulse:

resample-method = trivial
enable-remixing = no
enable-lfe-remixing = no
default-sample-format = s16le
default-sample-rate = 48000
alternate-sample-rate = 24000
default-sample-channels = 2

PulseAudio can be started as a daemon or as a system-wide instance. When run in system-wide mode, the program automatically drops privileges from "root" and changes to the "pulse" user and group. In this case, before launching the program, the "pulse" user and group need to be created on the target system. In the example below, /var/run/pulse is the home directory for the "pulse" user:

adduser -h /var/run/pulse pulse
addgroup pulse-access
adduser pulse pulse-access

Because PulseAudio needs to access the sound devices, add the user "pulse" to the "audio" group too:

adduser pulse audio

Start bluetoothd and pulseaudio:

/usr/libexec/bluetooth/bluetoothd -d &
pulseaudio --system --realtime &

To verify that pulseaudio has been set up correctly, play a local wave file using the following command. If you can hear the sound, the system has been configured correctly.

paplay -vvv audio8k16S.wav

After setting up pulseaudio, launch bluetoothctl to pair with and connect to a mobile phone. After connecting, you should see the following information in the bluetoothctl console:

[bluetooth]# show
Controller 12:60:41:7F:03:00
        Name: BlueZ 5.21
        Alias: BlueZ 5.21
        Class: 0x1c0000
        Powered: yes
        Discoverable: no
        Pairable: yes
        UUID: PnP Information           (00001200-0000-1000-8000-00805f9b34fb)
        UUID: Generic Access Profile    (00001800-0000-1000-8000-00805f9b34fb)
        UUID: Generic Attribute Profile (00001801-0000-1000-8000-00805f9b34fb)
        UUID: A/V Remote Control        (0000110e-0000-1000-8000-00805f9b34fb)
        UUID: A/V Remote Control Target (0000110c-0000-1000-8000-00805f9b34fb)
        UUID: Message Notification Se.. (00001133-0000-1000-8000-00805f9b34fb)
        UUID: Message Access Server     (00001132-0000-1000-8000-00805f9b34fb)
        UUID: Phonebook Access Server   (0000112f-0000-1000-8000-00805f9b34fb)
        UUID: IrMC Sync                 (00001104-0000-1000-8000-00805f9b34fb)
        UUID: OBEX File Transfer        (00001106-0000-1000-8000-00805f9b34fb)
        UUID: OBEX Object Push          (00001105-0000-1000-8000-00805f9b34fb)
        UUID: Vendor specific           (00005005-0000-1000-8000-0002ee000001)
        UUID: Audio Source              (0000110a-0000-1000-8000-00805f9b34fb)
        UUID: Audio Sink                (0000110b-0000-1000-8000-00805f9b34fb)
        Modalias: usb:v1D6Bp0246d0515
        Discovering: no

If you can see the Audio Sink UUID, you are ready to enjoy Bluetooth music.
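The pairing itself is done interactively in bluetoothctl; a typical session looks like this (a sketch, with XX:XX:XX:XX:XX:XX standing in for the phone's Bluetooth address as shown by scan):

[bluetooth]# power on
[bluetooth]# agent on
[bluetooth]# default-agent
[bluetooth]# discoverable on
[bluetooth]# scan on
[bluetooth]# pair XX:XX:XX:XX:XX:XX
[bluetooth]# connect XX:XX:XX:XX:XX:XX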
View full article
Video, bad performance
gst-launch filesrc location=test.mp4 typefind=true ! aiurdemux ! vpudec ! mfw_v4lsink

Video, better performance
gst-launch filesrc location=sample.mp4 typefind=true ! aiurdemux ! queue max-size-time=0 ! vpudec ! mfw_v4lsink
# typefind=true allows to 'type find' the source file before negotiating
# max-size-time=0 indicates to ignore possible blocking issues

# In case of ASF files
gst-launch filesrc location=sample.asf typefind=true ! aiurdemux ! queue max-size-time=0 ! mfw_wmvdecoder ! mfw_v4lsink

Audio
gst-launch filesrc location=sample.mp3 typefind=true ! beepdec ! audioconvert ! 'audio/x-raw-int, channels=2' ! alsasink

Audio with visualization
gst-launch filesrc location=sample.mp3 typefind=true ! beepdec ! tee name=t ! queue ! audioconvert ! 'audio/x-raw-int, channels=2' ! alsasink t. ! queue ! audioconvert ! goom ! autovideoconvert ! autovideosink

Video/Audio long version
gst-launch filesrc location=sample.avi typefind=true ! aiurdemux name=demux demux. ! queue max-size-buffers=0 max-size-time=0 ! vpudec ! mfw_v4lsink demux. ! queue max-size-buffers=0 max-size-time=0 ! beepdec ! audioconvert ! 'audio/x-raw-int, channels=2' ! alsasink
# queue properties max-size-buffers=0 and max-size-time=0 allow a smoother playback; type 'gst-inspect queue' for more info

VA short version (gplay)
gplay sample.avi

VA short version (playbin2)
gst-launch playbin2 uri=file://<full path to sample file>
View full article
Dumping the pipeline elements into an image file
# On target, run the pipeline
$ export GST_DEBUG_DUMP_DOT_DIR=<folder where dot files are created>
$ gst-launch playbin2 uri=file://${avi}
$ # Move the .dot files to a host machine (scp, etc)
# On host
dot <dot file> -Tpng -o out.png
# the dot command is part of the graphviz package

Querying which elements are being used on a gst-launch command
GST_DEBUG=GST_ELEMENT_FACTORY:3 gst-launch playbin2 uri=file://`pwd`/<media file>

Interrupting a gst-launch process running in the background
kill -INT $PID # where $PID is the process ID

Using only SW codecs
# Backup and remove
$ find /usr/lib/gstreamer-0.10 -name "libmfw*" | grep -v sink | xargs tar cvf /libmfw_gst.tar
$ find /usr/lib/gstreamer-0.10 -name "libmfw*" | grep -v sink | xargs rm
# Run your pipeline. This time SW codecs are used
$ gst-launch playbin2 uri=file://`pwd`/media_file
# To 'install' the FSL plugins again, just untar the file
$ cd / && tar xvf libmfw_gst.tar && cd -
# then run your pipeline. This time HW codecs are used
$ gst-launch playbin2 uri=file://`pwd`/media_file
View full article
A description of VPU and IPU usage in the Android R13.4 GA release for the i.MX6DQ.
View full article
If you want to use a USB camera (these are also called 'web cameras') with GStreamer on i.MX6 devices (Linux kernel version >= 3.0.35), you need to either load the module dynamically or compile and link it statically, selecting (Y) the following config in the kernel configuration:

Device Drivers -> Multimedia support -> Video capture adapters -> V4L USB devices -> <*> USB Video Class (UVC)

After the kernel image has been built, flash it onto the target, plug in the web cam, then on a (target) terminal run:

gst-launch v4l2src ! mfw_v4lsink

You should see what the camera is capturing on the display. In case you need to encode the camera src data, you need to place the encoder into the pipeline:

gst-launch v4l2src num-buffers=100 ! queue ! vpuenc codec=0 ! matroskamux ! filesink location=output.mkv sync=false

We are using a certain codec (codec=0 means mpeg4); check the options using 'gst-inspect vpuenc'.
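If UVC support was built as a module (<M> instead of <*>), it can be loaded dynamically instead (uvcvideo is the standard kernel module name for the UVC driver):

modprobe uvcvideo
# a /dev/videoN node should appear once the camera is plugged in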
View full article
Audio, from a file
gst-launch filesrc location=test.wav ! wavparse ! mfw_mp3encoder ! filesink location=output.mp3

Audio recording
gst-launch alsasrc num-buffers=$NUMBER blocksize=$SIZE ! mfw_mp3encoder ! filesink location=output.mp3
# where
#     duration = $NUMBER * $SIZE * 8 / (samplerate * channels * bitwidth)
# Example: 60 seconds recording
# gst-launch alsasrc num-buffers=240 blocksize=44100 ! mfw_mp3encoder ! filesink location=output.mp3
#
# To verify that it is correct, do a normal audio playback
gst-launch filesrc location=output.mp3 typefind=true ! beepdec ! audioconvert ! 'audio/x-raw-int,channels=2' ! alsasink

Video, from a test source
gst-launch videotestsrc ! queue ! vpuenc ! matroskamux ! filesink location=./test.avi

Video, from a file
gst-launch filesrc location=sample.yuv blocksize=$BLOCK_SIZE ! 'video/x-raw-yuv,format=(fourcc)I420, width=$WIDTH, height=$HEIGHT, framerate=(fraction)30/1' ! vpuenc codec=$CODEC ! matroskamux ! filesink location=output.mkv sync=false
# where
#     BLOCK_SIZE = WIDTH * HEIGHT * 1.5
#     CODEC = 0(MPEG4), 5(H263), 6(H264) or 12(MJPG).
#
# For example, encoding a CIF raw file
gst-launch filesrc location=sample.yuv blocksize=152064 ! 'video/x-raw-yuv,format=(fourcc)I420, width=352, height=288, framerate=(fraction)30/1' ! vpuenc codec=0 ! matroskamux ! filesink location=sample.mkv sync=false

Video, from a web camera
# When the web cam is connected, the device node /dev/video0 should be present. In order to test the camera, without encoding:
gst-launch v4l2src ! mfw_v4lsink
# For recording, run:
gst-launch v4l2src num-buffers=-1 ! queue max-size-buffers=2 ! vpuenc codec=0 ! matroskamux ! filesink location=output.mkv sync=false
# where sync=false indicates filesink not to synchronize against the clock
#
# In case a specific width/height is needed, just add the filter caps
gst-launch v4l2src num-buffers=-1 ! 'video/x-raw-yuv,format=(fourcc)I420, width=352, height=288, framerate=(fraction)30/1' ! queue ! vpuenc codec=0 ! matroskamux ! filesink location=output.mkv sync=false
#
# In case you want to see on the screen what the camera is capturing, add a tee element
gst-launch v4l2src num-buffers=-1 ! tee name=t ! queue ! mfw_v4lsink t. ! queue ! vpuenc codec=0 ! matroskamux ! filesink location=output.mkv sync=false

Video, from a parallel/MIPI camera
# The camera driver needs to be loaded before executing the pipeline; refer to the BSP document to see which driver to load
# MIPI (J5 port):
modprobe ov5640_camera_mipi
modprobe mxc_v4l2_capture
# Parallel (J9 port):
modprobe ov5642_camera
modprobe mxc_v4l2_capture

gst-launch mfw_v4lsrc ! queue ! vpuenc codec=0 ! matroskamux ! filesink location=output.mkv sync=false

# Do a 'gst-inspect mfw_v4lsrc' or 'gst-inspect vpuenc' to see other possible settings (resolution, fps, codec, etc.)
View full article