i.MX Processors Knowledge Base

Notes:
+ First run the playback pipeline, then the streaming pipeline.
+ This example streams H.263 video and AMR audio; change the codec format to your needs.
+ When the i.MX is the streaming machine, the audio encoder 'amrnbenc' must be installed first. This scenario has not been tested.

Shell variables and pipelines

Playback machine (receiver)

# On the playback machine, set either the IMX2PC or the PC2IMX variables, then run the pipeline

## IMX2PC: the PC does the playback
AUDIO_DEC_SINK="rtpamrdepay ! amrnbdec ! alsasink"
VIDEO_CAPS="\"application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H263-1998\""
VIDEO_DEC_SINK="rtph263pdepay ! ffdec_h263 ! autovideosink"
## End of IMX2PC settings

## PC2IMX: the i.MX does the playback
AUDIO_DEC_SINK="rtpamrdepay ! mfw_amrdecoder ! alsasink"
VIDEO_CAPS="\"application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H263-1998\""
VIDEO_DEC_SINK="rtph263pdepay ! vpudec ! mfw_v4lsink"
## End of PC2IMX settings

PLAYBACK_AUDIO="udpsrc caps=\"application/x-rtp,media=(string)audio,clock-rate=(int)8000,encoding-name=(string)AMR,encoding-params=(string)1,octet-align=(string)1\" \
        port=5002 ! rtpbin.recv_rtp_sink_1 \
        rtpbin. ! $AUDIO_DEC_SINK \
        udpsrc port=5003 ! rtpbin.recv_rtcp_sink_1 \
        rtpbin.send_rtcp_src_1 ! udpsink port=5007 sync=false async=false"

PLAYBACK_VIDEO="udpsrc caps=$VIDEO_CAPS port=5000 ! rtpbin.recv_rtp_sink_0 \
        rtpbin. ! $VIDEO_DEC_SINK \
        udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0 \
        rtpbin.send_rtcp_src_0 ! udpsink port=5005 sync=false async=false"

PLAYBACK_AV="$PLAYBACK_VIDEO $PLAYBACK_AUDIO"

# Playback pipeline
gst-launch -v gstrtpbin name=rtpbin $PLAYBACK_AV

Streaming machine (sender)

# On the streaming machine, set either the IMX2PC or the PC2IMX variables, then run the pipeline

## IMX2PC: the i.MX does the streaming
IP=x.x.x.x # IP address of the playback machine
VIDEO_SRC="mfw_v4lsrc"
VIDEO_ENC="vpuenc codec=h263 ! rtph263ppay"
AUDIO_ENC="audiotestsrc ! amrnbenc ! rtpamrpay"
## End of IMX2PC settings

## PC2IMX: the PC does the streaming
IP=y.y.y.y # IP address of the playback machine
VIDEO_SRC="v4l2src"
VIDEO_ENC="ffenc_h263 ! rtph263ppay"
AUDIO_ENC="audiotestsrc ! amrnbenc ! rtpamrpay"
## End of PC2IMX settings

STREAM_AUDIO="$AUDIO_ENC ! rtpbin.send_rtp_sink_1 \
        rtpbin.send_rtp_src_1 ! udpsink host=$IP port=5002 \
        rtpbin.send_rtcp_src_1 ! udpsink host=$IP port=5003 sync=false async=false \
        udpsrc port=5007 ! rtpbin.recv_rtcp_sink_1"

STREAM_VIDEO="$VIDEO_SRC ! $VIDEO_ENC ! rtpbin.send_rtp_sink_0 \
        rtpbin.send_rtp_src_0 ! queue ! udpsink host=$IP port=5000 \
        rtpbin.send_rtcp_src_0 ! udpsink host=$IP port=5001 sync=false async=false \
        udpsrc port=5005 ! rtpbin.recv_rtcp_sink_0"

STREAM_AV="$STREAM_VIDEO $STREAM_AUDIO"

# Stream pipeline
gst-launch -v gstrtpbin name=rtpbin $STREAM_AV

Notes:
+ Run the pipelines in the presented order.
+ This example streams H.263 video.
+ The 'gl' command is shorthand for 'gst-launch' (two characters instead of eleven); see the alias sketch at the end of this article.
+ Pending work: H.264 test cases and other scenarios.

Shell variables and pipelines

# Always export these variables on the i.MX
export VSALPHA=1
export WIDTH=320
export HEIGHT=240
export SEP=20

Uni-directional: from PC to i.MX. The PC streams four H.263 streams; the i.MX decodes and displays all of them on the screen.

# On i.MX (target)
gl udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8890 ! rtph263depay ! vpudec ! mfw_isink sync=false axis-top=0 axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT &
gl udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8891 ! rtph263depay ! vpudec ! mfw_isink sync=false axis-top=0 axis-left=`expr $WIDTH + $SEP` disp-width=$WIDTH disp-height=$HEIGHT &
gl udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8892 ! rtph263depay ! vpudec ! mfw_isink sync=false axis-top=`expr $HEIGHT + $SEP` axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT &
gl udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8893 ! rtph263depay ! vpudec ! mfw_isink sync=false axis-top=`expr $HEIGHT + $SEP` axis-left=`expr $WIDTH + $SEP` disp-width=$WIDTH disp-height=$HEIGHT &

# On PC (source)
export IP_iMX= # Place the IP address of the i.MX board
gst-launch -v videotestsrc ! ffenc_h263 ! rtph263pay ! multiudpsink clients=$IP_iMX:8890,$IP_iMX:8891,$IP_iMX:8892,$IP_iMX:8893

Uni-directional: from PC to i.MX. The PC streams one H.264 stream and the i.MX displays it on the screen.

# On i.MX (target)
# Make sure you set the caps correctly, especially the sprop-parameter-sets cap. The one shown below is just an example and works with the source file sintel_trailer-1080p.mp4
export VSALPHA=1
GST_DEBUG=*:2 gst-launch -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z2QAMqw05gHgCJ+WEAAAAwAQAAADAwDxgxmg\\,aOl4TLIs\", payload=(int)96' port=8890 ! rtph264depay ! vpudec ! mfw_isink sync=false

# On PC (source)
gst-launch -v filesrc location=sintel_trailer-1080p.mp4 typefind=true ! qtdemux ! rtph264pay ! multiudpsink clients=10.112.102.168:8890

Bi-directional: the PC streams four H.263 streams to the i.MX; the i.MX displays them and sends the four streams back to the PC.

# On i.MX
export IP_PC= # Place the IP address of the PC host machine
gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8890 ! rtph263depay ! vpudec ! tee name=t ! queue ! mfw_isink sync=false axis-top=0 axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT t. ! queue ! vpuenc codec=5 ! rtph263pay ! udpsink host=$IP_PC port=9990 &
gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8891 ! rtph263depay ! vpudec ! tee name=t ! queue ! mfw_isink sync=false axis-top=0 axis-left=`expr $WIDTH + $SEP` disp-width=$WIDTH disp-height=$HEIGHT t. ! queue ! vpuenc codec=5 ! rtph263pay ! udpsink host=$IP_PC port=9991 &
gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8892 ! rtph263depay ! vpudec ! tee name=t ! queue ! mfw_isink sync=false axis-top=`expr $HEIGHT + $SEP` axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT t. ! queue ! vpuenc codec=5 ! rtph263pay ! udpsink host=$IP_PC port=9992 &
gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8893 ! rtph263depay ! vpudec ! tee name=t ! queue ! mfw_isink sync=false axis-top=`expr $HEIGHT + $SEP` axis-left=`expr $WIDTH + $SEP` disp-width=$WIDTH disp-height=$HEIGHT t. ! queue ! vpuenc codec=5 ! rtph263pay ! udpsink host=$IP_PC port=9993 &

# On PC
## Streams received from the i.MX
export IP_iMX= # Place the IP address of the i.MX board
gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=9990 ! rtph263depay ! ffdec_h263 ! xvimagesink &
gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=9991 ! rtph263depay ! ffdec_h263 ! xvimagesink &
gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=9992 ! rtph263depay ! ffdec_h263 ! xvimagesink &
gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=9993 ! rtph263depay ! ffdec_h263 ! xvimagesink &

## Streams sent to the i.MX
gl -v videotestsrc ! videoscale ! video/x-raw-yuv,width=\(int\)1408,height=\(int\)1152 ! ffenc_h263 ! rtph263pay ! udpsink host=$IP_iMX port=8890 &
gl -v videotestsrc ! videoscale ! video/x-raw-yuv,width=\(int\)1408,height=\(int\)1152 ! ffenc_h263 ! rtph263pay ! udpsink host=$IP_iMX port=8891 &
gl -v videotestsrc ! videoscale ! video/x-raw-yuv,width=\(int\)1408,height=\(int\)1152 ! ffenc_h263 ! rtph263pay ! udpsink host=$IP_iMX port=8892 &
gl -v videotestsrc ! videoscale ! video/x-raw-yuv,width=\(int\)1408,height=\(int\)1152 ! ffenc_h263 ! rtph263pay ! udpsink host=$IP_iMX port=8893 &

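The 'gl' shorthand used above is not a standard command; it can be defined once per shell session, for example (a minimal sketch):

# Define 'gl' as a two-character alias for gst-launch
alias gl='gst-launch'
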
GStreamer has a simple tracing feature that allows the developer to do basic debugging. It can be enabled in two ways:
+ adding the parameter --gst-debug=LIST to the pipeline (a pipeline is an executed gst-launch command), or
+ prepending the environment variable GST_DEBUG=LIST.

LIST is a comma-separated argument indicating the GStreamer elements to trace. For example, to trace the sink element:

     $ GST_DEBUG=*sink*:5 gst-launch playbin2 uri=file:///sample.avi

or

     $ gst-launch playbin2 uri=file:///sample.avi --gst-debug=*sink*:5

Both commands produce the same log. To trace more than one element, simply add more <element>:5 entries, for example:

     $ GST_DEBUG=mfw_v4lsink:5,vpudec:5 gst-launch playbin2 uri=file:///sample.avi

The number 5 indicates the log level, where 5 is the highest (the most verbose log you can get) and 0 produces no output (5=LOG, 4=DEBUG, 3=INFO, 2=WARN, 1=ERROR).

The log can be huge on each pipeline run. One way to filter it is with the grep command. Before grepping, redirect the standard error to the standard output (the GStreamer log always goes to stderr):

     $ GST_DEBUG=mfw_v4lsink:5,vpudec:5 gst-launch playbin2 uri=file:///sample.avi 2>&1 | grep <filter string>

In case the log needs to be shared, it is important to remove the color escape codes from it: add the parameter --gst-debug-no-color or prepend the environment variable GST_DEBUG_NO_COLOR=1.

More shell variables that GStreamer reacts to can be found here: https://developer.gnome.org/gstreamer/0.10/gst-running.html

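As a concrete example of combining the options above, a shareable, color-free log of the VPU decoder can be captured to a file like this (an illustrative command):

     # Disable color codes and write the stderr log to gst.log for later sharing
     $ GST_DEBUG_NO_COLOR=1 GST_DEBUG=vpudec:5 gst-launch playbin2 uri=file:///sample.avi 2> gst.log
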
Overview:
This document is written for Freescale customers who have the Freescale AC3 release package (an excluded package). If you do not have the AC3 release package, you can disregard this document.
The Freescale OMX Player in the Android release supports audio track selection when playing files with multiple audio tracks. However, most customers do not use this enhanced API to select the audio track even if the current audio codec is not supported. To avoid soundless output when only some of the audio tracks can be played, this document provides a method to automatically select an audio track that can actually be decoded. The patch in this document is not included in the current release because it does not match our track selection rule (play the first track). If you have any idea about this issue, feel free to add comments to this document.

Issue description:
Software: R13.4-GA or R13.4.1 Android releases
Hardware: MX6Dual/Quad SabreSD board
Test source: 1.mkv
Test steps:
1. Launch Gallery from the main menu.
2. Play the video. You can watch the video, but there is no sound.

Root cause:
The file has two audio tracks, DTS and AC3: track 1 is DTS and track 2 is AC3. OMX Player chooses the first audio track as the default track to play, which is the DTS audio. However, the software only supports the AC3 audio codec, so it cannot set up an audio decoder for the DTS track. If the AC3 track is chosen instead, sound can be heard.

How to fix:
The audio track index is set in GMPlayer::LoadParser(). You can get the audio format there to check whether it is supported by the decoder. Please see the attached patch audio_track_slection.diff.

gst-inspect is a tool to get documentation about GStreamer elements.

Check the number of installed GStreamer elements:
gst-inspect | tail -1

Check the installed Freescale (FSL) GStreamer elements:
gst-inspect | grep imx

Show the documentation of a specific element:
gst-inspect <gst element>

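For instance, to quickly verify that a specific element (here the VPU decoder) is registered before building a pipeline around it (an illustrative check):

# gst-inspect returns a non-zero exit code if the element is not found
gst-inspect vpudec > /dev/null 2>&1 && echo "vpudec is available" || echo "vpudec is missing"
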
The i.MX 8QXP MEK no longer allows OV5640/LVDS/LCD usage only by changing the device tree. This happens because the M4 core owns the I2C resources, so the A core must use RPMsg to enable virtual drivers. Because of this, if the user only changes the device tree, for instance to *ov5640.dtb, the kernel won't boot and enters the following loop:

[    8.603353] [drm] Supports vblank timestamp caching Rev 2 (21.10.2013).
[    8.610025] [drm] No driver support for vblank timestamp query.
[    8.616077] imx-drm display-subsystem: bound imx-drm-dpu-bliteng.2 (ops dpu_)
[    8.624978] imx-drm display-subsystem: bound imx-dpu-crtc.0 (ops dpu_crtc_op)
[    8.632526] imx-drm display-subsystem: bound imx-dpu-crtc.1 (ops dpu_crtc_op)
[    8.639833] imx-drm display-subsystem: failed to bind ldb@562210e0 (ops imx_7
[    8.648428] imx-drm display-subsystem: master bind failed: -517

With the approach described in this post, it is possible to make this change manually, only by replacing the flash.bin used by U-Boot with a non-M4 one. In order to build the new flash.bin file, the following files are needed:
- u-boot.bin from the internal U-Boot provided by NXP
- scfw_tcm.bin from the SCFW porting kit
- bl31.bin from the ARM Trusted Firmware
- the SECO firmware container image

Disclaimer
The procedures described in this document target a GNU/Linux host (Ubuntu 20.04 LTS) and are focused on i.MX 8QXP B0 + BSP L4.19.35_1.1.0.

Required packages

1 - Install the ARM64 toolchain:
1.1 - Install the ARM64 GCC and G++ cross-compilers:
# apt install gcc-aarch64-linux-gnu g++-aarch64-linux-gnu

2 - Install the ARM32 GCC6 toolchain:
2.1 - Download the ARM32 GCC6 toolchain and install it:
$ mkdir ~/gcc_toolchain
$ cp ~/Downloads/gcc-arm-none-eabi-6-2017-q2-update-linux.tar.bz2 ~/gcc_toolchain/
$ cd ~/gcc_toolchain/
$ tar xvjf gcc-arm-none-eabi-6-2017-q2-update-linux.tar.bz2
# apt-get update
# apt-get install srecord

3 - Download mkimage:
3.1 - Create a new directory for the packages:
$ mkdir flash_build
$ cd flash_build
3.2 - Clone imx-mkimage:
$ git clone https://source.codeaurora.org/external/imx/imx-mkimage -b imx_4.19.35_1.1.0

4 - U-Boot build
4.1 - Clone the U-Boot source:
$ git clone https://source.codeaurora.org/external/imx/uboot-imx -b imx_v2019.04_4.19.35_1.1.0
$ cd uboot-imx
4.2 - Export the ARM64 toolchain:
$ export ARCH=arm64
$ export CROSS_COMPILE=/usr/bin/aarch64-linux-gnu-
4.3 - Build it:
$ unset LDFLAGS
$ make -j4 imx8qxp_mek_defconfig
$ make
4.4 - Copy the binary files to the imx-mkimage/iMX8QX directory:
$ cp spl/u-boot-spl.bin ../imx-mkimage/iMX8QX/
$ cp u-boot-nodtb.bin ../imx-mkimage/iMX8QX/
$ cd ..

5 - ARM Trusted Firmware
5.1 - Clone imx-atf:
$ git clone https://source.codeaurora.org/external/imx/imx-atf -b imx_4.19.35_1.1.0
$ cd imx-atf
5.2 - Build it:
$ unset LDFLAGS
$ make PLAT=imx8qx bl31
5.3 - Copy the binary file to the imx-mkimage/iMX8QX directory:
$ cp build/imx8qx/release/bl31.bin ../imx-mkimage/iMX8QX/
$ cd ..
6 - SCFW
6.1 - Export the ARM32 GCC6 toolchain:
$ export TOOLS=~/gcc_toolchain/
6.2 - Download the BSP L4.19.35_1.1.0 SCFW porting kit and copy it to the flash_build directory:
$ cp ~/Downloads/imx-scfw-porting-kit-1.2.7.1.tar.gz .
$ tar xvzf imx-scfw-porting-kit-1.2.7.1.tar.gz
$ cd packages/
$ chmod a+x imx-scfw-porting-kit-1.2.7.1.bin
$ ./imx-scfw-porting-kit-1.2.7.1.bin
6.3 - Build it for the i.MX 8QXP MEK B0:
$ cd imx-scfw-porting-kit-1.2.7.1/src/
$ tar xvzf scfw_export_mx8qx_b0.tar.gz
$ cd scfw_export_mx8qx_b0/
$ make qx R=B0 B=mek
6.4 - Copy the binary file to the imx-mkimage/iMX8QX directory and return to flash_build:
$ cp build_mx8qx_b0/scfw_tcm.bin ../../../../imx-mkimage/iMX8QX/
$ cd ../../../../

7 - SECO firmware container image
7.1 - Download the SECO firmware binaries, copy them to the flash_build directory and extract them:
$ cp ~/Downloads/firmware-imx-7.9.bin .
$ chmod a+x firmware-imx-7.9.bin
$ ./firmware-imx-7.9.bin
7.2 - Copy the binary file to the imx-mkimage/iMX8QX directory:
$ cp firmware-imx-7.9/firmware/seco/mx8qx-ahab-container.img imx-mkimage/iMX8QX/

8 - Build flash.bin
8.1 - In a new terminal, open the imx-mkimage directory:
$ cd flash_build/imx-mkimage
8.2 - Build it:
$ make SOC=iMX8QX flash
8.3 - Deploy it to the SD card:
$ sudo dd if=iMX8QX/flash.bin of=/dev/sdX bs=1k seek=32 && sync

Now you are able to use any non-rpmsg .dtb without kernel errors.

Author: Pedro Jardim: pedro.jardim@nxp.com

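A final practical note on step 8.3: before running dd, it is worth double-checking which device node corresponds to the SD card, since dd overwrites the target (an illustrative check, not part of the original procedure):

# List block devices with size and model; the SD card is typically /dev/sdX or /dev/mmcblkX
$ lsblk -d -o NAME,SIZE,MODEL
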
Audio, from a file
gst-launch filesrc location=test.wav ! wavparse ! mfw_mp3encoder ! filesink location=output.mp3

Audio recording
gst-launch alsasrc num-buffers=$NUMBER blocksize=$SIZE ! mfw_mp3encoder ! filesink location=output.mp3
# where
#     duration = $NUMBER * $SIZE * 8 / (samplerate * channels * bitwidth)
# Example: 60 seconds of recording
gst-launch alsasrc num-buffers=240 blocksize=44100 ! mfw_mp3encoder ! filesink location=output.mp3
#
# To verify that the recording is correct, do a normal audio playback
gst-launch filesrc location=output.mp3 typefind=true ! beepdec ! audioconvert ! 'audio/x-raw-int,channels=2' ! alsasink

Video, from a test source
gst-launch videotestsrc ! queue ! vpuenc ! matroskamux ! filesink location=./test.avi

Video, from a file
gst-launch filesrc location=sample.yuv blocksize=$BLOCK_SIZE ! 'video/x-raw-yuv,format=(fourcc)I420, width=$WIDTH, height=$HEIGHT, framerate=(fraction)30/1' ! vpuenc codec=$CODEC ! matroskamux ! filesink location=output.mkv sync=false
# where
#     BLOCK_SIZE = WIDTH * HEIGHT * 1.5
#     CODEC = 0 (MPEG4), 5 (H263), 6 (H264) or 12 (MJPG)
#
# For example, encoding a CIF raw file
gst-launch filesrc location=sample.yuv blocksize=152064 ! 'video/x-raw-yuv,format=(fourcc)I420, width=352, height=288, framerate=(fraction)30/1' ! vpuenc codec=0 ! matroskamux ! filesink location=sample.mkv sync=false

Video, from a web camera
# When the web camera is connected, the device node /dev/video0 should be present. To test the camera without encoding:
gst-launch v4l2src ! mfw_v4lsink
# To record, run:
gst-launch v4l2src num-buffers=-1 ! queue max-size-buffers=2 ! vpuenc codec=0 ! matroskamux ! filesink location=output.mkv sync=false
# where sync=false tells filesink not to use clock synchronization
#
# In case a specific width/height is needed, just add the filter caps
gst-launch v4l2src num-buffers=-1 ! 'video/x-raw-yuv,format=(fourcc)I420, width=352, height=288, framerate=(fraction)30/1' ! queue ! vpuenc codec=0 ! matroskamux ! filesink location=output.mkv sync=false
#
# In case you want to see on the screen what the camera is capturing, add a tee element
gst-launch v4l2src num-buffers=-1 ! tee name=t ! queue ! mfw_v4lsink t. ! queue ! vpuenc codec=0 ! matroskamux ! filesink location=output.mkv sync=false

Video, from a parallel/MIPI camera
# The camera driver needs to be loaded before executing the pipeline; refer to the BSP documentation to see which driver to load
# MIPI (J5 port):
modprobe ov5640_camera_mipi
modprobe mxc_v4l2_capture
# Parallel (J9 port):
modprobe ov5642_camera
modprobe mxc_v4l2_capture

gst-launch mfw_v4lsrc ! queue ! vpuenc codec=0 ! matroskamux ! filesink location=output.mkv sync=false

# Run 'gst-inspect mfw_v4lsrc' or 'gst-inspect vpuenc' to see other possible settings (resolution, fps, codec, etc.)

Recently I published an i.MX Dev Blog post about the Gateworks gst-variable-rtsp-server plugin support for i.MX 6. Now you can check how to use it on i.MX 8 SoCs as well.

1. Preparing the image
In order to use the gst-variable-rtsp-server plugin, prepare your machine and distro. Add the following line to conf/local.conf:
IMAGE_INSTALL_append += "gstreamer1.0-rtsp-server gst-variable-rtsp-server"
Download the attached patch and apply it by doing:
$ cd <yocto_path>/sources/meta-fsl-bsp-release/
$ git am ~/Download/0001-Add-RTSP-support-for-i.MX-8-L4.14.78_ga1.0.0-or-olde.patch
Note: This patch is not necessary for the L4.14.98_ga2.0.0 BSP.
Then build the image with bitbake and deploy it to the SD card.

2. Video test source example
Server:
$ gst-variable-rtsp-server -p 9001 -u "videotestsrc ! v4l2h264enc ! rtph264pay name=pay0 pt=96"

3. Camera example
Server:
$ gst-variable-rtsp-server -p 9001 -u "v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480 ! v4l2h264enc ! rtph264pay name=pay0 pt=96"

Client:
In order to use VLC or another application as the client, just enter the RTSP URL served by the board.

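For reference, a client along these lines can be used on the host side (a sketch; the board IP is a placeholder and the /stream mount point is assumed to be the server default, so adjust it to how gst-variable-rtsp-server was started):

# GStreamer client
gst-launch-1.0 playbin uri=rtsp://<board-ip>:9001/stream
# or VLC from the command line
vlc rtsp://<board-ip>:9001/stream
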
This document describes the i.MX 8MQ EVK HDMI output and mini-SAS connector features for Linux and Android use cases, covering the supported daughter board, the process to change Device Tree (DTS) files or boot images, and how to enable these different display options on the board.

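As a rough illustration of what switching the Device Tree involves (a sketch only; the exact .dtb file name depends on the display configuration and BSP release, so the name below is a placeholder):

# From the U-Boot prompt, point the boot script at the desired device tree and save the change
=> setenv fdt_file <desired-display-variant>.dtb
=> saveenv
=> boot
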
Qt framework
Qt is a cross-platform, complete development framework with tools designed to streamline the creation of native applications and user interfaces for desktop, embedded and mobile platforms. Qt's cross-platform framework and tools enable developers to target various desktop, embedded, mobile and real-time operating systems with one code base. Qt brings freedom to the developer, saving development time, adding efficiency and ultimately shortening time to market.

Building Qt
- Compile Qt for i.MX28
- Building QT5 for i.MX53
- Building QT for i.MX6
- Qt on i.MX6

Installing tools
- Installing and Configuring QT Creator (Ubuntu)
- Qt5 with Qt3D over Wayland rootfs

Demos
- Qt5 Cinematic Experience Demo on i.MX6
- Video - i.MX53 Qt5 qt3d demo
- Qt5 with Qt3D over Wayland rootfs

Information
- Qt5 on i.MX6 DO's and DONT's
- Best Practices for QML

Multiple-Overlay (or Multi-Overlay) means several video playbacks on a single screen. In case multiple screens are needed, check the dual-display case, GStreamer i.MX6 Multi-Display.

$ export VSALPHA=1
$ SAMPLE1=sample1.avi; SAMPLE2=sample2.avi; SAMPLE3=sample3.avi; SAMPLE4=sample4.avi;
$ WIDTH=320; HEIGHT=240; SEP=20

Four displays (2x2)
$ gst-launch \
playbin2 uri=file://`pwd`/$SAMPLE1 video-sink="mfw_isink axis-top=0 axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT" \
playbin2 uri=file://`pwd`/$SAMPLE2 video-sink="mfw_isink axis-top=0 axis-left=`expr $WIDTH + $SEP` disp-width=$WIDTH disp-height=$HEIGHT" \
playbin2 uri=file://`pwd`/$SAMPLE3 video-sink="mfw_isink axis-top=`expr $HEIGHT + $SEP` axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT" \
playbin2 uri=file://`pwd`/$SAMPLE4 video-sink="mfw_isink axis-top=`expr $HEIGHT + $SEP` axis-left=`expr $WIDTH + $SEP` disp-width=$WIDTH disp-height=$HEIGHT"

Basic rotation (2x1, normal and inverted)
gst-launch \
playbin2 uri=file://`pwd`/$SAMPLE1 video-sink="mfw_isink axis-top=0 axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT rotation=0" \
playbin2 uri=file://`pwd`/$SAMPLE2 video-sink="mfw_isink axis-top=`expr $HEIGHT + $SEP` axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT rotation=3"

There is no Freescale GStreamer element which does JPEG decoding, so we must rely on a standard one, like 'jpegdec'. In case your Linux system was built using LTIB, follow these steps to have the jpegdec element included in gst-plugins-good:
1. In the LTIB menuconfig, make sure the following packages are selected:
   - gstreamer-plugins-good
   - libjpeg
   - libpng
2. Remove the configure parameters '--disable-libpng' and '--disable-jpeg' in the file './dist/lfs-5.1/gst-plugins-good/gst-plugins-good.spec'
3. Rebuild and flash your board (or SD card) again.

Image display
VSALPHA=1 gst-launch filesrc location=sample.jpeg ! jpegdec ! imagefreeze ! mfw_isink

Important: a width or height that is not 8-pixel aligned is treated as an unsupported format by the isink plugin.

If you want to use a USB camera (these types of cameras are also called 'web cameras') with GStreamer on i.MX6 devices (Linux kernel version >= 3.0.35), you need to either load the module dynamically or compile and link it statically by selecting (Y) the following config in the kernel configuration:

     Device Drivers -> Multimedia support -> Video capture adapters -> V4L USB devices -> <*> USB Video Class (UVC)

After the kernel image has been built, flash it onto the target, plug in the web camera, then on a (target) terminal run:

     gst-launch v4l2src ! mfw_v4lsink

You should see what the camera is capturing on the display. In case you need to encode the camera source data, place the encoder into the pipeline:

     gst-launch v4l2src num-buffers=100 ! queue ! vpuenc codec=0 ! matroskamux ! filesink location=output.mkv sync=false

Here a specific codec is selected (codec=0 means MPEG4); check the options using 'gst-inspect vpuenc'.

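Before running the pipelines, it can be useful to confirm that the UVC driver actually detected the camera (an illustrative check):

# The uvcvideo module should be listed and a new /dev/videoN node should appear
lsmod | grep uvcvideo
ls /dev/video*
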
Note: All these GStreamer pipelines have been tested on an i.MX6Q board with kernel version 3.0.35-2026-geaaf30e.

Tools:
- gst-launch
- gst-inspect

FSL pipeline examples:
- GStreamer i.MX6 Decoding
- GStreamer i.MX6 Encoding
- GStreamer Transcoding and Scaling
- GStreamer i.MX6 Multi-Display
- GStreamer i.MX6 Multi-Overlay
- GStreamer i.MX6 Camera Streaming
- GStreamer RTP Streaming

Other plugins:
- GStreamer ffmpeg
- GStreamer i.MX6 Image Capture
- GStreamer i.MX6 Image Display

Misc:
- Testing GStreamer
- Tracing GStreamer Pipelines
- GStreamer miscellaneous

Dumping the pipeline elements into an image file
# On target, run the pipeline
$ export GST_DEBUG_DUMP_DOT_DIR=<folder where dot files are created>
$ gst-launch playbin2 uri=file://${avi}
# Move the .dot files to a host machine (scp, etc.)
# On host
dot <dot file> -Tpng -o out.png
# the dot command is part of the graphviz package

Querying which elements are being used in a gst-launch command
GST_DEBUG=GST_ELEMENT_FACTORY:3 gst-launch playbin2 uri=file://`pwd`/<media file>

Interrupting a gst-launch process running in the background
kill -INT $PID # where $PID is the process ID

Using only SW codecs
# Back up and remove the FSL plugins
$ find /usr/lib/gstreamer-0.10 -name "libmfw*" | grep -v sink | xargs tar cvf /libmfw_gst.tar
$ find /usr/lib/gstreamer-0.10 -name "libmfw*" | grep -v sink | xargs rm
# Run your pipeline. This time SW codecs are used
$ gst-launch playbin2 uri=file://`pwd`/media_file
# To 'install' the FSL plugins again, just untar the file
$ cd / && tar xvf libmfw_gst.tar && cd -
# Then run your pipeline. This time HW codecs are used
$ gst-launch playbin2 uri=file://`pwd`/media_file

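When a run produces several .dot files, they can all be rendered at once on the host with a simple loop (a minimal sketch):

# Render each pipeline graph to a PNG named after its .dot file
for f in *.dot; do dot -Tpng "$f" -o "${f%.dot}.png"; done
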
Video, bad performance
gst-launch filesrc location=test.mp4 typefind=true ! aiurdemux ! vpudec ! mfw_v4lsink

Video, better performance
gst-launch filesrc location=sample.mp4 typefind=true ! aiurdemux ! queue max-size-time=0 ! vpudec ! mfw_v4lsink
# typefind=true allows the source file to be type-found before negotiation
# max-size-time=0 indicates to ignore possible blocking issues
# In case of ASF files
gst-launch filesrc location=sample.asf typefind=true ! aiurdemux ! queue max-size-time=0 ! mfw_wmvdecoder ! mfw_v4lsink

Audio
gst-launch filesrc location=sample.mp3 typefind=true ! beepdec ! audioconvert ! 'audio/x-raw-int, channels=2' ! alsasink

Audio with visualization
gst-launch filesrc location=sample.mp3 typefind=true ! beepdec ! tee name=t ! queue ! audioconvert ! 'audio/x-raw-int, channels=2' ! alsasink t. ! queue ! audioconvert ! goom ! autovideoconvert ! autovideosink

Video/Audio, long version
gst-launch filesrc location=sample.avi typefind=true ! aiurdemux name=demux demux. ! queue max-size-buffers=0 max-size-time=0 ! vpudec ! mfw_v4lsink demux. ! queue max-size-buffers=0 max-size-time=0 ! beepdec ! audioconvert ! 'audio/x-raw-int, channels=2' ! alsasink
# The queue properties max-size-buffers=0 and max-size-time=0 allow a smoother playback; type 'gst-inspect queue' for more info

Video/Audio, short version with gplay
gplay sample.avi

Video/Audio, short version with playbin2
gst-launch playbin2 uri=file://<full path to sample file>

Description
This document explains how to develop an audio card driver for the i.MX6 platform. It covers the basic knowledge of the ASoC architecture and then gives several samples for audio driver development:
1. NXP SGTL5000: supported by default by the NXP i.MX BSP for the SabreLite board.
2. Wolfson WM8524:
   A. 3.0.35 BSP support: the i.MX6 set-top box BSP supports it (the older Freescale community link is out of date).
   B. 3.14.28 BSP support: please check the attachment.
3. Wolfson WM8960: includes how to add the Android middle layer and the driver; please check the attachment.
4. TI TLV320AIC3120: includes how to add the Android middle layer and the driver; please check the attachment.
5. TI TLV320AIC3X

Products
- MPU: i.MX6 Family - https://www.nxp.com/products/processors-and-microcontrollers/arm-processors/i-mx-applications-processors/i-mx-6-processors:IMX6X_SERIES

Tools
- NXP development board: i.MX6 SabreSDP - https://www.nxp.com/design/development-boards:EVDEBRDSSYS#/collection=softwaretools&start=0&max=25&query=typeTax%3E%3Et633::archived%3E%3E0::Sub_Asset_Type%3E%3ETSP::deviceTax%3E%3Ec731_c380_c127_c126&sorting=Buy%2FSpecifications.desc&language=en&siblings=false

The attachments include the document MX6X_ASOC_V5-20191115.pdf and related driver sample code.

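As a quick sanity check once one of these codec drivers is integrated, the registered sound card can be verified from user space on the target (an illustrative check):

# The new card should show up in the ALSA card list and as a playback device
cat /proc/asound/cards
aplay -l
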
This document describes all the i.MX 8 MIPI-CSI use cases, showing the available cameras and daughter cards supported by the boards, the compatible Device Tree (DTS) files, and how to enable these different camera options on the i.MX 8 boards. In addition, it covers some advanced camera use cases, such as multiple camera outputs using the imxcompositor_g2d plugin, GStreamer zero-copy pipelines, and V4L2 API extra-controls examples.

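To give a flavor of the extra-controls usage mentioned above, a capture pipeline can pass V4L2 controls straight from GStreamer (a sketch; the control name, resolution and display sink are placeholders that depend on the sensor and BSP in use):

# 'c' is just an arbitrary structure name; horizontal_flip is an example control exposed by many sensor drivers
gst-launch-1.0 v4l2src device=/dev/video0 extra-controls="c,horizontal_flip=1" ! "video/x-raw,width=1280,height=720" ! waylandsink
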
[Chinese translation] See the attachment.
Original post: https://community.nxp.com/docs/DOC-343528

///////////////////////// Creating the device node /dev/galcore /////////////////////////

$home/myandroid/kernel_imx/drivers/mxc/gpu-viv/Kbuild
MODULE_NAME ?= galcore  /* defines the node name */

$home/myandroid/kernel_imx/drivers/mxc/gpu-viv/hal/os/linux/kernel/gc_hal_kernel_linux.h
#define DEVICE_NAME "galcore"

$home/myandroid/kernel_imx/drivers/mxc/gpu-viv/hal/os/linux/kernel/gc_hal_kernel_probe.c
drv_init calls: ret = register_chrdev(major, DEVICE_NAME, &driver_fops);

///////////////////////// OpenGL ES 2 functions /////////////////////////

myandroid/device/fsl-proprietary/gpu-viv/lib/egl/libGLESv2_VIVANTE.so exports the GL entry points:
glActiveTexture
glBindBuffer
...

These glXxx calls end up in sub_D40C (decompiled listing):

int __fastcall sub_D40C(int a1, int a2, int a3) // address 0x0000D40C
{
  int result; // r0@1
  int v4;
  int v5;

  v4 = a2;
  v5 = a3;
  gcoOS_GetTLS(&v4);  // ------------> goes into libGAL.so
  result = v4;
  if ( v4 )
    result = *(_DWORD *)(v4 + 36);
  return result;
}

and in $home/myandroid/device/fsl-proprietary/gpu-viv/lib/libGAL.so:

// exported function
signed int __fastcall gcoOS_GetTLS(void **a1)
{
  ...
  v4 = open("/dev/galcore", 2);
  ...
}

The device node /dev/galcore passes the commands into the galcore module:
$home/myandroid/kernel_imx/drivers/mxc/gpu-viv/hal/kernel/gc_hal_kernel.c
gckKERNEL_Dispatch

This document was generated from the following discussion: Share Vivante 3d gc2000 work flow

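On a running i.MX 6 system, the pieces described above can be spot-checked from user space (an illustrative check):

# The galcore module should be loaded and the character device created in drv_init should exist
lsmod | grep galcore
ls -l /dev/galcore
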