i.MX Processors Knowledge Base

There is no Freescale GStreamer element which performs JPEG decoding, so we must rely on a standard one, like 'jpegdec'. If your Linux system was built using LTIB, follow these steps to have the jpegdec element included in gst-plugins-good:
1. In the LTIB menuconfig, make sure the following packages are selected: gstreamer-plugins-good, libjpeg, libpng.
2. Remove the configure parameters '--disable-libpng' and '--disable-jpeg' in the file './dist/lfs-5.1/gst-plugins-good/gst-plugins-good.spec'.
3. Rebuild and flash your board (or SD card) again.
Image display:
VSALPHA=1 gst-launch filesrc location=sample.jpeg ! jpegdec ! imagefreeze ! mfw_isink
Important: a width or height that is not 8-pixel aligned is treated as an unsupported format by the isink plugin.
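If your JPEG's dimensions are not 8-pixel aligned, one workaround is to insert a scaler so that the sink receives an aligned size. This is a sketch, not from the original article; the 1024x768 target is an arbitrary example:
VSALPHA=1 gst-launch filesrc location=sample.jpeg ! jpegdec ! videoscale ! 'video/x-raw-yuv,width=(int)1024,height=(int)768' ! imagefreeze ! mfw_isink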
In this article, some experiments are done to verify the capability of the i.MX6DQ for video playback under different VPU clocks.

1. Preparation
Board: i.MX6DQ SD
Bitstream: 1080p sunflower with 40Mbps, considered the toughest H.264 clip. The original clip is copied 20 times to generate a new raw video (the sunflower clip repeated 20 times) and then encapsulated into an mp4 container. This removes and minimizes the influence of gstreamer's startup workload compared to the vpu unit test.
Kernels: kernels generated with different VPU clock settings: 270MHz, 298MHz, 329MHz, 352MHz, 382MHz.
Test setting: 1080p content decoded and displayed on a 1080p device (no resize).

2. Test commands for the VPU unit test and Gstreamer
Tiled-format video playback is faster than NV12 format, so in the experiments below we choose the tiled format.
Unit test command (the frame rate is set with -a 70, higher than the 1080p60 HDMI refresh rate):
/unit_tests/mxc_vpu_test.out -D "-i /media/65a78bbd-1608-4d49-bca8-4e009cafac5e/sunflower_2B_2ref_WP_40Mbps.264 -f 2 -y 1 -a 70"
Gstreamer command (free run to get the highest playback speed):
gst-launch filesrc location=/media/65a78bbd-1608-4d49-bca8-4e009cafac5e/sunflower_2B_2ref_WP_40Mbps.mp4 typefind=true ! aiurdemux ! vpudec framedrop=false ! queue max-size-buffers=3 ! mfw_v4lsink sync=false

3. Video playback framerate measurement
During the test, we run "echo performance > /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor" to make sure the CPU always works at its highest frequency, so that it can respond to any interrupt quickly. For each test point with a different VPU clock, we run 5 rounds of tests. The max and min values are removed, and the remaining 3 values are averaged to get the final playback framerate.

| VPU clock | Test | #1 Dec/Play | #2 Dec/Play | #3 Dec/Play | #4 Dec/Play | #5 Dec/Play | Min | Max | Avg |
| 270M | unit test | 57.8 / 57.3 | 57.81 / 57.04 | 57.78 / 57.3 | 57.87 / 56.15 | 57.91 / 55.4 | 55.4 | 57.3 | 56.83 |
| 270M | GST | 53.76 | 54.163 | 54.136 | 54.273 | 53.659 | 53.659 | 54.273 | 54.01967 |
| 298M | unit test | 60.97 / 58.37 | 60.98 / 58.55 | 60.97 / 57.8 | 60.94 / 58.07 | 60.98 / 58.65 | 57.8 | 58.65 | 58.33 |
| 298M | GST | 56.755 | 49.144 | 53.271 | 56.159 | 56.665 | 49.144 | 56.755 | 55.365 |
| 329M | unit test | 63.8 / 59.52 | 63.92 / 52.63 | 63.8 / 58.1 | 63.82 / 58.26 | 63.78 / 59.34 | 52.63 | 59.52 | 58.56667 |
| 329M | GST | 57.815 | 55.857 | 56.862 | 58.637 | 56.703 | 55.857 | 58.637 | 57.12667 |
| 352M | unit test | 65.79 / 59.63 | 65.78 / 59.68 | 65.78 / 59.65 | 66.16 / 49.21 | 65.93 / 57.67 | 49.21 | 59.68 | 58.98333 |
| 352M | GST | 58.668 | 59.103 | 56.419 | 58.08 | 58.312 | 56.419 | 59.103 | 58.35333 |
| 382M | unit test | 64.34 / 56.58 | 67.8 / 58.73 | 67.75 / 59.68 | 67.81 / 59.36 | 67.77 / 59.76 | 56.58 | 59.76 | 59.25667 |
| 382M | GST | 59.753 | 58.893 | 58.972 | 58.273 | 59.238 | 58.273 | 59.753 | 59.03433 |

Note: "Dec" is the VPU decoding fps, while "Play" is the overall playback fps; Min/Max/Avg are computed over the playback values.

Some explanation: why does the Gstreamer performance keep improving while the unit test flattens out? On Gstreamer there is a vpu wrapper which makes the vpu API more intuitive to call. At first, the overall GST playback performance is constrained by the VPU (vpu dec 57.8 fps). Eventually, as the VPU decoding performance rises above 60fps with increasing VPU clock, the constraint becomes the 60fps display refresh rate. The video display overhead of Gstreamer is only about 1 fps, similar to the unit test.

Based on the test results, at 352MHz the overall 1080p video playback on a 1080p display can reach ~60fps. Alternatively, if the VPU is time-shared by two pipelines with two displays, we can do 2 x 1080p @ 30fps video playback.
However, this experiment is only valid for 1080p video playback on a 1080p display. For interlaced clips, or when the display size differs from 1080p, the overall playback performance is further limited by post-processing such as de-interlacing and resizing.
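As a side note, the trimmed mean used above (drop the min and max of the 5 runs, then average the remaining 3) can be reproduced with a small shell one-liner; the sample values below are the 270M unit-test playback numbers:
echo "57.3 57.04 57.3 56.15 55.4" | tr ' ' '\n' | sort -n | sed '1d;$d' | awk '{s+=$1} END {print s/NR}'   # prints 56.83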
1. Set up HDMI
Set up your kernel to use HDMI by adding the following to the bootargs in u-boot:
video=mxcfb0:dev=hdmi,1920x1080M@60,if=RGB24

2. Test raw audio
In order to test only raw audio, use the following command:
aplay -D hw:1,0 Kaleidoscope.wav

3. Make HDMI audio the default output
In order to configure audio output over HDMI, replace the content of the file ~/.asoundrc with the following:

pcm.dmix_48000{
     type dmix
     ipc_key 5678293
     ipc_key_add_uid yes
     slave{
          pcm "hw:1,0"
          period_time 0
          period_size 2048
          buffer_size 24576
          format S16_LE
          rate 48000
     }
}
pcm.!dsnoop_44100{
     type dsnoop
     ipc_key 5778293
     ipc_key_add_uid yes
     slave{
          pcm "hw:0,0"
          period_time 0
          period_size 2048
          buffer_size 24576
          format S16_LE
          rate 44100
     }
}
pcm.!dsnoop_48000{
     type dsnoop
     ipc_key 5778293
     ipc_key_add_uid yes
     slave{
          pcm "hw:1,0"
          period_time 0
          period_size 2048
          buffer_size 24576
          format S16_LE
          rate 48000
     }
}
pcm.asymed{
     type asym
     playback.pcm "dmix_48000"
     capture.pcm "dsnoop_44100"
}
pcm.dsp0{
     type plug
     slave.pcm "asymed"
}
pcm.!default{
     type plug
     route_policy "average"
     slave.pcm "asymed"
}
ctl.mixer0{
     type hw
     card 0
}

This configures alsa to use sound card hw:1,0. Please pay attention to use the proper audio card name for your device. To see the available sound cards on the board:

root@imx53qsb:~# aplay -l
**** List of PLAYBACK Hardware Devices ****
card 0: imx3stack [imx-3stack], device 0: SGTL5000 SGTL5000-0 []
  Subdevices: 1/1
  Subdevice #0: subdevice #0
card 1: imx3stackspdif [imx-3stack-spdif], device 0: IMX SPDIF mxc spdif-0 []
  Subdevices: 1/1
  Subdevice #0: subdevice #0

For details on how to create asound.conf, please see the alsa-lib configuration introduction.

4. Encoded audio
For encoded (e.g. AC3, DTS) audio, you can use, for example, ac3dec, a utility provided by alsa-tools, with the following command line:
ac3dec -D hw:1,0 -C test.ac3
This works for both HDMI audio and SPDIF audio. Double check your hardware and/or schematic in order to know which one to use.
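To quickly verify the new default route without a wav file, speaker-test from alsa-utils can be used, assuming it is installed on the rootfs (a sketch, not part of the original article):
speaker-test -D default -c 2 -r 48000 -t sine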
Notes:
+ Run the pipelines in the presented order.
+ These examples stream H.263 video (one H.264 case is included).
+ The 'gl' command is a short alias for 'gst-launch' (see the alias sketch at the end of this section).
+ Pending work: more H.264 test cases and other scenarios.

# Always export these variables on the i.MX
export VSALPHA=1
export WIDTH=320
export HEIGHT=240
export SEP=20

Uni-directional: from PC to i.MX. The PC streams 4 H.263 streams and the i.MX decodes and displays all of them on the screen.

# On i.MX (Target)
gl udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8890 ! rtph263depay ! vpudec ! mfw_isink sync=false axis-top=0 axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT &
gl udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8891 ! rtph263depay ! vpudec ! mfw_isink sync=false axis-top=0 axis-left=`expr $WIDTH + $SEP` disp-width=$WIDTH disp-height=$HEIGHT &
gl udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8892 ! rtph263depay ! vpudec ! mfw_isink sync=false axis-top=`expr $HEIGHT + $SEP` axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT &
gl udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8893 ! rtph263depay ! vpudec ! mfw_isink sync=false axis-top=`expr $HEIGHT + $SEP` axis-left=`expr $WIDTH + $SEP` disp-width=$WIDTH disp-height=$HEIGHT &

# On PC (Source)
export IP_iMX= # Place the IP address of the i.MX board
gst-launch -v videotestsrc ! ffenc_h263 ! rtph263pay ! multiudpsink clients=$IP_iMX:8890,$IP_iMX:8891,$IP_iMX:8892,$IP_iMX:8893

Uni-directional: from PC to i.MX. The PC streams one H.264 stream and the i.MX displays it on the screen.

# On i.MX (Target)
# Make sure you set the caps correctly, especially the sprop-parameter-sets cap. The one shown below is just an example and works with the source file sintel_trailer-1080p.mp4
export VSALPHA=1
GST_DEBUG=*:2 gst-launch -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z2QAMqw05gHgCJ+WEAAAAwAQAAADAwDxgxmg\\,aOl4TLIs\", payload=(int)96' port=8890 ! rtph264depay ! vpudec ! mfw_isink sync=false

# On PC (Source); replace the example client IP with your board's address
gst-launch -v filesrc location=sintel_trailer-1080p.mp4 typefind=true ! qtdemux ! rtph264pay ! multiudpsink clients=10.112.102.168:8890

Bi-directional: the PC streams 4 H.263 streams to the i.MX; the i.MX displays them and sends the four streams back to the PC.

# On i.MX
export IP_PC= # Place the IP address of the PC host machine
gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8890 ! rtph263depay ! vpudec ! tee name=t ! queue ! mfw_isink sync=false axis-top=0 axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT t. ! queue ! vpuenc codec=5 ! rtph263pay ! udpsink host=$IP_PC port=9990 &
gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8891 ! rtph263depay ! vpudec ! tee name=t ! queue ! mfw_isink sync=false axis-top=0 axis-left=`expr $WIDTH + $SEP` disp-width=$WIDTH disp-height=$HEIGHT t. ! queue ! vpuenc codec=5 ! rtph263pay ! udpsink host=$IP_PC port=9991 &
gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8892 ! rtph263depay ! vpudec ! tee name=t ! queue !
mfw_isink sync=false axis-top=`expr $HEIGHT + $SEP` axis-left=0   disp-width=$WIDTH disp-height=$HEIGHT t. ! queue ! vpuenc codec=5 ! rtph263pay ! udpsink host=$IP_PC port=9992 & gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8893 ! rtph263depay ! vpudec ! tee name=t ! queue ! mfw_isink sync=false axis-top=`expr $HEIGHT + $SEP` axis-left=`expr $WIDTH + $SEP` disp-width=$WIDTH disp-height=$HEIGHT t. ! queue ! vpuenc codec=5 ! rtph263pay ! udpsink host=$IP_PC port=9993 & # On PC ## Stream received from iMX export IP_iMX= # Place the IP address of the i.MX board gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=9990 ! rtph263depay ! ffdec_h263 ! xvimagesink & gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=9991 ! rtph263depay ! ffdec_h263 ! xvimagesink & gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=9992 ! rtph263depay ! ffdec_h263 ! xvimagesink & gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=9993 ! rtph263depay ! ffdec_h263 ! xvimagesink & ## Stream sent to iMX gl -v videotestsrc ! videoscale ! video/x-raw-yuv,width=\(int\)1408,height=\(int\)1152 !  ffenc_h263 ! rtph263pay ! udpsink host=$IP_iMX port=8890 & gl -v videotestsrc ! videoscale ! video/x-raw-yuv,width=\(int\)1408,height=\(int\)1152 ! ffenc_h263 ! rtph263pay ! udpsink host=$IP_iMX port=8891 & gl -v videotestsrc ! videoscale ! video/x-raw-yuv,width=\(int\)1408,height=\(int\)1152 ! ffenc_h263 ! rtph263pay ! udpsink host=$IP_iMX port=8892 & gl -v videotestsrc ! videoscale ! video/x-raw-yuv,width=\(int\)1408,height=\(int\)1152 ! ffenc_h263 ! rtph263pay ! udpsink host=$IP_iMX port=8893 &
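For reference, the 'gl' shorthand used throughout these scenarios can be defined in the shell before running the pipelines (a minimal sketch; adjust for your shell):
alias gl='gst-launch'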
gst-launch is the tool to execute GStreamer pipelines.

Task | Pipeline
Looking at caps | gst-launch -v <gst elements>
Enable log | gst-launch --gst-debug=<element>:<level>
Example | gst-launch --gst-debug=videotestsrc:5 videotestsrc ! filesink location=/dev/null
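When the debug output is long, it helps to capture it to a file; GStreamer prints debug messages on stderr (a usage sketch for the same 0.10-era environment; num-buffers simply bounds the run):
gst-launch --gst-debug=videotestsrc:5 videotestsrc num-buffers=100 ! filesink location=/dev/null 2> gst.log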
Test digital zoom with the IPU for camera preview.
Board: SABRE-SD (i.MX6DQ)
BSP: Android 13.4GA
In the original flow (figure not reproduced here), one frame buffer is processed in four steps during camera preview. We add a step that changes the frame buffer before step 4; the added step zooms one preview frame. The crop function of the IPU library (shown in a figure in the original article) is used to scale the frame.
Test result: screenshots were taken at preview zoom level 0 and at the maximum preview zoom level (thumbnails omitted here).
When taking pictures at 5M pixels with the zoom above level 1, the picture size is not 2592x1944 but 2016x1512. The underlying reason is that the IPU crop function only supports a maximum output of 2048x2048.
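The zoom-by-crop idea can be summarized by a small helper that computes a centered crop window, which the IPU then scales back up to the full frame size. This is a hypothetical sketch, not code from the article; the 8-pixel alignment mask reflects the usual alignment constraints of the i.MX video path:

/* Hypothetical sketch: centered crop window for a given zoom factor.
 * The IPU scales the (cw x ch) window back up to the full (w x h). */
void zoom_crop(int w, int h, float zoom, int *cx, int *cy, int *cw, int *ch)
{
    *cw = ((int)(w / zoom)) & ~7;  /* crop width, kept 8-pixel aligned */
    *ch = ((int)(h / zoom)) & ~7;  /* crop height, kept 8-pixel aligned */
    *cx = (w - *cw) / 2;           /* center the window horizontally */
    *cy = (h - *ch) / 2;           /* center the window vertically */
}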
Dumping the pipeline elements into an image file
# On target, run the pipeline
$ export GST_DEBUG_DUMP_DOT_DIR=<folder where dot files are created>
$ gst-launch playbin2 uri=file://${avi}
$ # Move the .dot files to a host machine (scp, etc)
# On host (the dot command is part of the graphviz package)
dot <dot file> -Tpng -o out.png

Querying which elements are being used on a gst-launch command
GST_DEBUG=GST_ELEMENT_FACTORY:3 gst-launch playbin2 uri=file://`pwd`/<media file>

Interrupting a gst-launch process running in the background
kill -INT $PID # where $PID is the process ID

Using only SW codecs
# Back up and remove the FSL plugins
$ find /usr/lib/gstreamer-0.10 -name "libmfw*" | grep -v sink | xargs tar cvf /libmfw_gst.tar
$ find /usr/lib/gstreamer-0.10 -name "libmfw*" | grep -v sink | xargs rm
# Run your pipeline. This time SW codecs are used
$ gst-launch playbin2 uri=file://`pwd`/media_file
# To 'install' the FSL plugins again, just untar the file
$ cd / && tar xvf libmfw_gst.tar && cd -
# Then run your pipeline. This time HW codecs are used
$ gst-launch playbin2 uri=file://`pwd`/media_file
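A pipeline typically dumps one .dot file per state change, so converting them all in a batch on the host is convenient (a small sketch using the same graphviz package):
for f in *.dot; do dot -Tpng "$f" -o "${f%.dot}.png"; done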
Qt framework
Qt is a complete cross-platform development framework with tools designed to streamline the creation of stunning native applications and amazing user interfaces for desktop, embedded and mobile platforms. Qt's cross-platform framework and tools enable developers to target various desktop, embedded, mobile and real-time operating systems with one code base. Qt brings freedom to the developer, saving development time, adding efficiency and ultimately shortening time to market.

Building Qt
Compile Qt for i.MX28
Building QT5 for i.MX53
Building QT for i.MX6
Qt on iMX6

Installing tools
Installing and Configuring QT Creator (Ubuntu)
Qt5 with Qt3D over Wayland rootfs

Demos
Qt5 Cinematic Experience Demo on i.MX6
Video - i.MX53 Qt5 qt3d demo
Qt5 with Qt3D over Wayland rootfs

Information
Qt5 on i.MX6 DO's and DONT's
Best Practices for QML
This note shows how to use the open source gstreamer1.0-rtsp-server package on i.MX6QDS and i.MX8x to stream video files and camera capture using the RTP protocol. (Note that the i.MX 6ULL and i.MX 7 do not have a Video Processing Unit (VPU).) The Real-time Transport Protocol (RTP) is a very common network protocol for delivering media over IP networks.

On the board, you will need a GStreamer pipeline that encodes the raw video, adds the RTP payload, and sends it over a network sink. A generic pipeline would look as follows:

video source ! video encoder ! RTP payload ! network sink

Video source: often a camera, but it can also be a video file or a test pattern, for example.
Video encoder: a video encoder such as H.264, H.265, VP8, JPEG and others.
RTP payload: an RTP payloader that matches the video encoder.
Network sink: a sink that streams over the network, often via UDP.

Prerequisites:
An i.MX 6x or i.MX 8x board with the L5.10.35 BSP installed.
A host PC with either GStreamer or VLC player installed.

Receiving an H.264/H.265 encoded RTP video stream on a host machine using GStreamer
GStreamer is a low-latency method for receiving RTP video. On your host machine, install GStreamer and run the following command:
$ gst-launch-1.0 -v udpsrc port=5000 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! autovideosink sync=false

Using a host PC: VLC player
Optionally, you can use VLC player to receive the RTP video on a PC. First, on your PC, create an sdp file named stream.sdp with the following content:

v=0
m=video 5000 RTP/AVP 96
c=IN IP4 127.0.0.1
a=rtpmap:96 H264/90000

After this, with the GStreamer pipeline running on the device, open this .sdp file with VLC Player on the host PC.

Sending an H.264/H.265 encoded RTP video stream
GStreamer provides a software H.264 encoding element named x264enc. Use this plugin if your board does not support H.264 encoding in hardware or if you want to use the same pipeline on different modules. Note that the video performance will be lower compared with the plugins with hardware-accelerated encoding.
# gst-launch-1.0 videotestsrc ! videoconvert ! x264enc ! rtph264pay config-interval=1 pt=96 ! udpsink host=<host-machine-ip> port=5000
Note: Replace <host-machine-ip> with the IP of the host machine. In all examples you can replace videotestsrc with the v4l2src element to collect a stream from a camera.

i.MX 8X
# gst-launch-1.0 videotestsrc ! videoconvert ! v4l2h264enc ! rtph264pay config-interval=1 pt=96 ! udpsink host=<host-machine-ip> port=5000

i.MX 8M Mini Quad / 8M Plus
# gst-launch-1.0 videotestsrc ! videoconvert ! vpuenc_h264 ! rtph264pay config-interval=1 pt=96 ! udpsink host=<host-machine-ip> port=5000

i.MX 6X
The i.MX6QDS does not support H.265, so H.264 is used:
# gst-launch-1.0 videotestsrc ! videoconvert ! vpuenc_h264 ! rtph264pay config-interval=1 pt=96 ! udpsink host=<host-machine-ip> port=5000

Using other video encoders
While examples of streaming video with other encoders are not provided, you may try them yourself. Use the gst-inspect tool to find the available encoders and RTP payloaders on the board:
# gst-inspect-1.0 | grep -e "encoder"
# gst-inspect-1.0 | grep -e "rtp" -e "payloader"
Then browse the results and replace the elements in the original pipelines. On the receiving end, you will have to use a corresponding depayloader; inspect the payloader element to find the corresponding values.
For example:
# gst-inspect-1.0 rtph264pay

Installing gstreamer1.0-rtsp-server on a Yocto BSP other than L5.10.35
To install gstreamer1.0-rtsp-server in any Yocto Project image, follow the steps below:
Enable the meta-multimedia layer by adding the following to your build/conf/bblayers.conf:
BBLAYERS += "${BSPDIR}/sources/meta-openembedded/meta-multimedia"
Include gstreamer1.0-rtsp-server in the image by adding the following to your build/conf/local.conf:
IMAGE_INSTALL_append += "gstreamer1.0-rtsp-server"
Run bitbake and mount your sdcard.
Copy the binaries: access the gstreamer1.0-rtsp-server examples folder:
$ cd /build/tmp/work/cortexa9hf-vfp-neon-poky-linux-gnueabi/gstreamer1.0-rtsp-server/$version/build/examples/.libs
Copy test-uri and test-launch to the rootfs /usr/bin folder:
$ sudo cp test-uri test-launch /media/USER/ROOTFS_PATH/usr/bin
Be sure that the IPs are correctly set:
SERVER: => ifconfig eth0 $SERVERIP
CLIENT: => ifconfig eth0 $CLIENTIP

Video file example
SERVER: => test-uri file:///home/root/video_file.mp4
CLIENT: => gst-launch-1.0 playbin uri=rtsp://$SERVERIP:8554/test
You can try to improve the framerate performance using a manual pipeline on the CLIENT with the rtspsrc plugin instead of playbin, as in the example below:
=> gst-launch-1.0 rtspsrc location=rtsp://$SERVERIP:8554/test caps='application/x-rtp' ! queue max-size-buffers=0 ! rtpjitterbuffer latency=100 ! queue max-size-buffers=0 ! rtph264depay ! queue max-size-buffers=0 ! decodebin ! queue max-size-buffers=0 ! imxv4l2sink sync=false

Camera example
SERVER: => test-launch "( imxv4l2src device=/dev/video0 ! capsfilter caps='video/x-raw, width=1280, height=720, framerate=30/1, mapping=/test' ! vpuenc_h264 ! rtph264pay name=pay0 pt=96 )"
CLIENT: => gst-launch-1.0 rtspsrc location=rtsp://$SERVERIP:8554/test ! decodebin ! autovideosink sync=false

The rtspsrc element has two properties that are very useful for RTSP streaming:
latency: useful for low-latency RTSP stream playback (default 200 ms);
buffer-mode: used to control the buffering algorithm; slave mode is recommended for low-latency communication.
Using these properties, the example below gets 29 FPS without a sync=false property in the sink plugin. The key achievement here is that there are no dropped frames:
=> gst-launch-1.0 rtspsrc location=rtsp://$SERVERIP:8554/test latency=100 buffer-mode=slave ! queue max-size-buffers=0 ! rtph264depay ! vpudec ! imxv4l2sink
The i.MX8 VPU hardware decoder supports the following video codecs:
H.265 HEVC Main Profile 4Kp60 Level 5.1
H.264 AVC Constrained Baseline, Main and High Profile
H.264 MVC
WMV9 / VC-1 Simple, Main and Advanced Profile
MPEG 1 and 2 Main Profile at High Level
AVS Jizhun Profile (JP)
MPEG-4.2 ASP, H.263, Sorenson Spark
DivX 3.11, with Global Motion Compensation (GMC)
ON2/Google VP6/VP8
RealVideo 8/9/10
JPEG and MJPEG A/B Baseline

The i.MX8 VPU Linux driver is implemented based on the V4L2 standard. Besides software video decoding, Chromium also supports hardware video decoding through VideoDecodeAccelerator implementations; one of them is V4L2VDA. Since V4L2VDA uses the V4L2 API, it is possible to change V4L2VDA to enable Chromium hardware video playback on i.MX8.

This doc shares patches that add Chromium video decode acceleration using the i.MX8QM/i.MX8QXP VPU. They enable Chromium H.264, H.265 and VP8 hardware video decoding. H.264 and H.265 need the mp4 container; VP8 uses the webm container.

HW: i.MX8QM/i.MX8QXP MEK board, 1080P HDMI display, mouse, keyboard
SW: i.MX8 5.10.72_2.2.2 yocto bsp release (which includes chromium 91.0), plus the patches in this doc

Patch description:
imx8-5.10.72-vpudrv-update.diff: updates the i.MX8 5.10.72_2.2.2 kernel VPU driver to https://source.codeaurora.org/external/imx/linux-imx/commit/drivers/mxc/vpu_malone?h=lf-5.15.y&id=fa7c67e2c9ed4fb8392fa258f931d6996339a17a
chromium-ozone-wayland_91.0.4472.114.bb.diff: changes meta-browser/meta-chromium/recipes-browser/chromium/chromium-ozone-wayland_91.0.4472.114.bb to add some compile flags, etc.
5.10.72-merge.patch: changes the Chromium source code to add video decode acceleration using the i.MX8 VPU.

Build steps:
1> Download the i.MX8 5.10.72_2.2.2 yocto release from nxp.com
2> Apply chromium-ozone-wayland_91.0.4472.114.bb.diff to change meta-browser/meta-chromium/recipes-browser/chromium/chromium-ozone-wayland_91.0.4472.114.bb
3> Put 5.10.72-merge.patch into the folder path_of_yocto-5.10.72-2.2.2/sources/meta-browser/meta-chromium/recipes-browser/chromium/files/
4> Apply imx8-5.10.72-vpudrv-update.diff to the i.MX8 5.10.72_2.2.2 kernel
5> Under the yocto image build folder, add CORE_IMAGE_EXTRA_INSTALL += "chromium-ozone-wayland" to the file path_of_yocto-5.10.72-2.2.2/folder-of-bld/conf/local.conf
6> Run bitbake to build the rootfs image

Test steps:
After the system boots up, put some video clips under /home/root/video, then run the command below (do not run chromium without any parameters, as that will start chromium with other settings; you can check /usr/lib/chromium/chromium-wrapper):
"/usr/lib/chromium/chromium-bin --no-sandbox --ozone-platform=wayland --enable-features=VaapiVideoDecoder --enable-accelerated-video-decode --enable-clear-hevc-for-testing --ignore-gpu-blacklist --window-size=1920,1180 /home/root/video"
Then use the mouse to click a video clip and playback will start.

Reference:
https://www.nxp.com/products/processors-and-microcontrollers/arm-processors/i-mx-applications-processors/i-mx-8-processors:IMX8-SERIES
https://www.nxp.com/design/software/embedded-software/i-mx-software/embedded-linux-for-i-mx-applications-processors:IMXLINUX
https://www.chromium.org/audio-video/#:~:text=codec%20and%20container%20support
https://github.com/igel-oss/meta-browser-hwdecode/blob/master/recipes-chromium/chromium/files/0001-Add-support-for-V4L2VDA-on-Linux.patch
The following document contains a list of documents, questions and discussions that are relevant in the community based on amount of views. If you are having a problem or doubt, or are getting started with i.MX processors, you should check the following links to see if your doubt is addressed there.

Yocto Project
Freescale Yocto Project main page
Yocto Training - HOME
i.MX Yocto Project: Frequently Asked Questions
Useful bitbake commands
Yocto Project Package Management - smart
How to add a new layer and a new recipe in Yocto
Setting up the Eclipse IDE for Yocto Application Development
Guide to the .sdcard format
Yocto NFS & TFTP boot
YOCTO project clean
Yocto with a package manager (ex: apt-get)
Yocto Setting the Default Ethernet address and disable DHCP on boot.

i.MX
Building QT for i.MX6
i.MX6/7 DDR Stress Test Tool V3.00
i.MX6DQSDL DDR3 Script Aid
Installing Ubuntu Rootfs on NXP i.MX6 boards
iMX6DQ MAX9286 MIPI CSI2 720P camera surround view solution for Linux BSP
i.MX Design&Tool Lists
Simple GPIO Example - quandry
i.MX6 GStreamer-imx Plugins - Tutorial & Example Pipelines
Streaming USB Webcam over Network
Step-by-step: How to setup TI Wilink (WL18xx) with iMX6 Linux 3.10.53

Linux / Kernel
Copying Files Between Windows and Linux using PuTTY
Building Linux Kernel
Patch to support uboot logo keep from uboot to kernel for NXP Linux and Android BSP (HDMI, LCD and LVDS)
load kernel from SD card in U-boot
Changing the Kernel configuration for i.MX6 SABRE

Android
The Android Booting process
What is inside the init.rc and what is it used for.

Others
How to use qtmultimedia(QML) with Gstreamer 1.0
Task | Pipeline
Video rendering | gst-launch videotestsrc ! mfw_v4lsink
Audio rendering | gst-launch audiotestsrc ! alsasink
WAV audio rendering | gst-launch filesrc location=test.wav ! wavparse ! alsasink
Video rendering selecting caps | gst-launch videotestsrc ! capsfilter caps='video/x-raw-yuv,format=(fourcc)I420' ! mfw_v4lsink
Video rendering selecting caps | gst-launch videotestsrc ! 'video/x-raw-yuv,format=(fourcc)I420' ! mfw_v4lsink
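A variation of the caps-selection row that also pins the resolution (a sketch along the same lines; the 640x480 size is arbitrary):
gst-launch videotestsrc ! 'video/x-raw-yuv,format=(fourcc)I420,width=(int)640,height=(int)480' ! mfw_v4lsink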
Overview
As more and more communication is required between online and offline, QR codes are widely used in mobile payment, mobile small apps, industrial identification of things, etc. The i.MX6UL/ULL has CSI and PXP IP blocks for camera connection and for image CSC/flip/rotation acceleration. An LCDIF IP supports the display, but there is no 3D IP support. This low-power, low-end AP is therefore very suitable for the industrial HMI segment, which does not require a fancy 3D graphic display, but a simple and straightforward GUI for interaction. A QR code scanner is one of the use cases in the industrial segment that more and more customers are focusing on. The i.MX6UL CPU frequency is about 500MHz and it has no GPU IP, so a lightweight GUI and window system is required. Here we recommend Qt with the wayland backend (without X11), which makes the window system smaller and faster than a traditional X11 UI. We chose Qt because it has an open source version, rich components, platform independence, good performance on embedded systems and strong development tools like QtCreator for creating applications. To enable the Qt development environment, check this: Enable QT developement for i.MX6UL (v2)

Here I made a QR code scanner demo based on QT5.6 + QZXing (QR/bar code scan engine) running on the i.MX6UL EVK board with a UVC camera (at least 640x480 resolution is required) and a 480x272px LCD. Source code is open here (License Apache2.0): https://github.com/muddog/QRScanner

Implementation
For camera preview and capture, gstreamer comes to mind first: it is easy to use and has the acceleration pads implemented by NXP for i.MX6UL. It is very easy to enable the preview in the console:
$ gst-launch-1.0 v4l2src device=/dev/video1 ! video/x-raw,format=YUY2,width=640,height=320 ! imxvideoconvert_pxp ! video/x-raw,format=RGB16 ! waylandsink
This works on the i.MX6UL EVK, with the PXP IP doing the color space conversion from YUY2 -> RGB16 and the potential scaling of the image. The CPU loading of this is about 20-30%; if you use the "videoconvert" component to replace "imxvideoconvert_pxp", CSC and scaling are done by the CPU and the loading increases to 50-60%. "/dev/video1" is the device node for the UVC camera; it may be different in your environment.

So our target is clear: create such a pipeline (with PXP acceleration) in the Qt application, use an appsink to get the preview images, and "sink" each image to a QWidget by drawing it on the widget surface for preview (say every 50ms for 20fps). Then, in another thread, we fetch the preview buffer at a fixed frequency (like every 0.5s) and feed it into the ZXing engine to decode the strings inside the image.

Here are the classes created inside the source code:

ScannerQWidgetSink
Acts as a gstreamer sink for preview rendering. It inits the pipeline and creates a timer with a timeout every 50ms. In the timer handler, we use the appsink to copy the camera buffer from gstreamer, and tell the ViewfinderWidget to update (re-draw event).

ViewfinderWidget
This class inherits from QWidget and draws the preview buffer as a QImage onto its own surface using QPainter. The QImage is created at the very beginning with the image buffer created by the ScannerQWidgetSink. Because QImage itself does not maintain the image buffer, the buffer must stay alive during its usage. So we keep this buffer during the ScannerQWidgetSink life cycle and copy the appsink buffer from the pipeline into it for preview.
MainWindow
Creates the main window, which has no title bar or border. Starts the animation for the red scan-line bar. Creates instances of DecoderThread and ScannerQWidgetSink, then sets them up and starts them.

DecoderThread
An infinite loop that waits for an available buffer released by the ScannerQWidgetSink every 0.5s. It copies the buffer data into its own buffer (imgData) to avoid any change to the buffer by the sink while decoding. It then feeds this copy into the ZXing engine to get the decode result, which is shown on the QLabel.

Screenshot under the wayland (weston) desktop: (screenshot omitted)

Customize
Camera instance
I use a UVC camera plugged into the USB host, whose device node is /dev/video1. If you want to use CSI or another device, change the construction parameters for ScannerQWidgetSink():
sink = new ScannerQWidgetSink(ui->widget, QString("v4l2src device=/dev/video1"));

Image resolution captured and previewed
Change the static member values of the ScannerQWidgetSink class:
uint ScannerQWidgetSink::CAPTURE_HEIGHT = 480;
uint ScannerQWidgetSink::CAPTURE_WIDTH = 640;

Preview fps and decoding frequency
Find the "framerate=20/1" string in ScannerQWidgetSink::GstPipelineInit() and change it to your fps. You also have to change the renderTimer start timeout value in ::StartRender(). The decoding frequency is determined by renderCnt, which determines after how many preview frames the decoder is fed.

Main window size
The main window size is fixed; you have to change mainwindow.ui. This is easy to do in the QtCreator Designer.

FAQ
Why not use a CSI camera in the demo?
Honestly, I do not have a CSI camera module; it is also DNP when you buy the board on NXP.com. So a widely used UVC camera is preferred; it also makes it easy for you to scan QR codes on your phone, your display panel, etc.

Why not use QCamera for preview and capture?
The QCamera class in the Qtmultimedia component uses the camerabin2 gstreamer plugin, which creates a very long pipeline for the different usages of viewfinder, image capture and video encoding. Camerabin2 eats too much CPU and memory resource; taking pictures and recording are very slow. A 30fps preview eats about 70-80% CPU loading even after I hacked it to use imxvideoconvert_pxp instead of the software videoconvert. Finally I gave up implementing QRScanner based on QCamera.

How to make sure only one instance of the QT app is running?
We can use QSharedMemory to create a shared memory block with a unique KEY. When a second instance of the app is started, it checks whether the shared memory with this KEY has already been created. If it is there, another instance is already running, and the new one has to exit().
But as the Qt documentation mentions, the QSharedMemory block may not be destroyed correctly when the app crashes, which means we have to handle each termination signal and do the delete ourselves:

static QSharedMemory *gShm = NULL;

static void terminate(int signum)
{
   if (gShm) {
      delete gShm;
      gShm = NULL;
   }
   qDebug() << "Terminate with signal:" << signum;
   exit(128 + signum);
}

int main(int argc, char *argv[])
{
   QApplication a(argc, argv);

   // Handle any further termination signals to ensure the
   // QSharedMemory block is deleted even if the process crashes
   signal(SIGHUP,  terminate); // 1
   signal(SIGINT,  terminate); // 2
   signal(SIGQUIT, terminate); // 3
   signal(SIGILL,  terminate); // 4
   signal(SIGABRT, terminate); // 6
   signal(SIGFPE,  terminate); // 8
   signal(SIGBUS,  terminate); // 10
   signal(SIGSEGV, terminate); // 11
   signal(SIGSYS,  terminate); // 12
   signal(SIGPIPE, terminate); // 13
   signal(SIGALRM, terminate); // 14
   signal(SIGTERM, terminate); // 15
   signal(SIGXCPU, terminate); // 24
   signal(SIGXFSZ, terminate); // 25

   gShm = new QSharedMemory("QRScannerNXP");
   if (!gShm->create(4, QSharedMemory::ReadWrite)) {
      delete gShm;
      qDebug() << "Only allow one instance of QRScanner";
      exit(0);
   }
   .....
This document describes the i.MX 8QXP MEK mini-SAS connector features for Linux and Android use cases, covering the supported daughter cards, the process to change Device Tree (DTS) files or boot images, and how to enable the different display options on the board.
Recently I published this i.MX Dev Blog post about the Gateworks plugin gst-variable-rtsp-server support for i.MX 6. Now you can check how to use it on i.MX 8 SoCs as well.

1. Preparing the image
In order to use the gst-variable-rtsp-server plugin, prepare your machine and distro:
Add the following line to conf/local.conf:
IMAGE_INSTALL_append += "gstreamer1.0-rtsp-server gst-variable-rtsp-server"
Download the attached patch and apply it:
$ cd <yocto_path>/sources/meta-fsl-bsp-release/
$ git am ~/Download/0001-Add-RTSP-support-for-i.MX-8-L4.14.78_ga1.0.0-or-olde.patch
Note: This patch is not necessary for the L4.14.98_ga2.0.0 BSP!
Then build the image with bitbake and deploy it to the SD card.

2. Video test source example
Server:
$ gst-variable-rtsp-server -p 9001 -u "videotestsrc ! v4l2h264enc ! rtph264pay name=pay0 pt=96"

3. Camera example
Server:
$ gst-variable-rtsp-server -p 9001 -u "v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480 ! v4l2h264enc ! rtph264pay name=pay0 pt=96"

On the client side, use VLC or another RTSP-capable application and enter the stream URL (the client screenshots from the original post are not reproduced here).
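A GStreamer pipeline can also act as the client. The sketch below assumes the server's default mount point (often /stream for gst-variable-rtsp-server); check the server's console output for the exact URL it announces:
$ gst-launch-1.0 rtspsrc location=rtsp://<board-ip>:9001/stream ! rtph264depay ! decodebin ! videoconvert ! autovideosink sync=false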
Check the new updated version for Morty here.

Step 1: Get the i.MX Yocto AVS setup environment
Review the steps under Chapter 3 of the i.MX_Yocto_Project_User's_Guide.pdf in the L4.X LINUX_DOCS to prepare your host machine, including at least the following essential Yocto packages:
$ sudo apt-get install gawk wget git-core diffstat unzip texinfo \
  gcc-multilib build-essential chrpath socat libsdl1.2-dev u-boot-tools

Install the i.MX NXP AVS repo
Create/move to a directory where you want to install the AVS yocto build environment; let's call it <yocto_dir>.
$ cd <yocto_dir>
$ repo init -u https://source.codeaurora.org/external/imxsupport/meta-avs-demos -b master -m imx7d-pico-avs-sdk_4.1.15-1.0.0.xml
Download the AVS BSP build environment:
$ repo sync

Step 2: Set up yocto for the Alexa_SDK image with the AVS-SETUP-DEMO script
Run the avs-setup-demo script as follows to set up your environment for the imx7d-pico board:
$ MACHINE=imx7d-pico DISTRO=fsl-imx-x11 source avs-setup-demo.sh -b <build_sdk>
where <build_sdk> is the name you will give to your build folder. After accepting the EULA, the script will prompt you to configure the following options:

Sound card selection
The following sound cards are supported by the build:
SGTL (on-board audio codec for PicoPi)
2-Mic Conexant
The script will ask whether you are going to use the Conexant card; if not, SGTL is assumed as your selection:
Are you going to use Conexant Sound Card [Y/N]?

Install Alexa SDK
The next option selects whether to pre-install the AVS SDK software on the image:
Do you want to build/include the AVS_SDK package on this image(Y/N)?
If you select YES, your image will contain the AVS SDK ready to use (after authentication). Note this AVS_SDK will not have WakeWord detection support, but it can be added at runtime. If your selection was NO, you can always manually fetch and build the AVS_SDK at runtime. All the package dependencies will already be there, so only fetching the AVS_SDK source code and building it is required.

Finish the avs-image configuration
At the end you will see a summary matching the configuration you selected for your image build. The example below is for a pre-installed AVS_SDK with Conexant sound card support and WiFi/BT not enabled.
==========================================================
  AVS configuration is now ready at conf/local.conf
    - Sound Card = Conexant
    - AVS_SDK pre-installed
  You are ready to bitbake your AVS demo image now:
    bitbake avs-image
==========================================================

Step 3: Build the AVS image
Go to your <build_sdk> directory and start the build of the avs-image. There are 2 options:
Regular build:
$ cd <yocto_dir>/<build_sdk>
$ bitbake avs-image
With QT5 support included:
$ cd <yocto_dir>/<build_sdk>
$ bitbake avs-image-qt5
The image with QT5 is useful if you want to add a GUI, for example to render DisplayCards.

Step 4: Deploy the built image to an SD/MMC card to boot the target board
After a build has successfully completed, the created image resides at <build_sdk>/tmp/deploy/images/imx7d-pico/. In this directory you will find the imx7d-pico-avs.sdcard image or imx7d-pico-avs-qt5.sdcard, depending on the build you chose in Step 3. To flash the .sdcard image into the eMMC device of your PicoPi board, follow the next steps:
Download the bootbomb flasher. Follow the instructions in Section 4,
"Board Reflashing", of the Quick Start Guide for the AVS kit to set up your board in flashing mode.
Copy the built SDCARD file (see the host-side device check after this section):
$ sudo dd if=imx7d-pico-avs.sdcard of=/dev/sd<x> bs=1M && sync
$ sync
(replace /dev/sd<x> with the device node the board's eMMC enumerates as on your host)
Properly eject the pico-imx7d board:
$ sudo eject /dev/sd<x>

NXP Documentation
Refer to the Quick Start Guide for the AVS SDK to fully set up your PicoPi board with the Synaptics 2-Mic and PicoPi i.MX7D.
For a more comprehensive understanding of Yocto, its features and setup, more image build and deployment options, and customization, please take a look at the i.MX_Yocto_Project_User's_Guide.pdf document from the Linux documents bundle mentioned at the beginning of this document.
For a more detailed description of the Linux BSP and u-boot use and configuration, please take a look at the i.MX_Linux_User's_Guide.pdf document from the same bundle.
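Before running dd, it is worth double-checking which device node the board enumerates as on the host, since writing to the wrong disk is destructive (a generic host-side check, not part of the original article):
$ lsblk -o NAME,SIZE,MODEL   # identify the PicoPi eMMC before running dd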
Overview:
This document is written for Freescale customers who have the Freescale AC3 release packages (excluded package). (If you do not have the AC3 release package, you can disregard this document.)
The Freescale OMX Player in the Android release supports audio track selection when playing files with multiple audio tracks. However, most customers do not use this enhanced API to select the audio track, even if the current audio codec is not supported. To avoid soundless output when only part of the audio tracks can be played, this document provides a method to automatically select an available audio track to play. The patch in this document is not included in our current release because it does not match our track selection rule - play the first track. If you have any idea about this issue, feel free to add comments to this document.

Issue description:
Software: R13.4-GA or R13.4.1 Android releases
Hardware: MX6Dual/Quad SabreSD board
Test source: 1.mkv
Test steps:
1. Launch Gallery from the main menu.
2. Play the video. You can watch the video, but without any sound.

Root cause:
The file has 2 audio tracks, DTS & AC3: audio track 1 is DTS and track 2 is AC3. The OMX Player chooses the first audio track to play as the default audio track, which is DTS audio. However, the software only supports the AC3 audio codec, so it cannot set up an audio decoder for the DTS track. If we choose to play the AC3 track instead, sound can be heard.

How to fix:
The audio track index is set in GMPlayer::LoadParser(). You can check the audio format of each track to see whether it is supported by a decoder (see the hypothetical sketch below). Please see the patch audio_track_slection.diff.
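For illustration only, the selection logic amounts to something like the following sketch. The names GetAudioTrackCount, GetAudioFormat, IsFormatSupported and SetAudioTrack are hypothetical stand-ins, not the real GMPlayer API; refer to the actual patch for the exact calls:

// Hypothetical sketch: pick the first audio track whose format the
// platform decoder supports, instead of always picking track 0.
for (int i = 0; i < parser->GetAudioTrackCount(); i++) {   // hypothetical API
    if (IsFormatSupported(parser->GetAudioFormat(i))) {    // hypothetical API
        parser->SetAudioTrack(i);                          // hypothetical API
        break;
    }
}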
This PDF is training material showing examples of video encoding, video decoding and video streaming on an i.MX53QSB board.
Hello everyone,
We have recently migrated our source code from CAF (Codeaurora) to Github, so old i.MX NXP recipes/manifests that point to Codeaurora will eventually be modified so they point correctly to Github, to avoid any issues while fetching with Yocto.
Also, all repo init commands for old releases should be changed from:
$ repo init -u https://source.codeaurora.org/external/imx/imx-manifest -b <branch name> [ -m <release manifest>]
to:
$ repo init -u https://github.com/nxp-imx/imx-manifest -b <branch name> [ -m <release manifest>]
This also applies to all source code that was stored in Codeaurora; the new repository for all i.MX NXP source code is: https://github.com/nxp-imx
For any issues regarding this, please create a community thread and/or a support ticket.
Regards,
Aldo.
gst-inspect is a tool to get documentation about GStreamer elements.

Task | Pipeline
Check installed GST elements | gst-inspect | tail -1
Check installed FSL GST elements | gst-inspect | grep imx
Element documentation | gst-inspect <gst element>
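A quick way to verify that a specific element is available before using it in a pipeline (a small shell sketch; vpudec is just an example element):
gst-inspect vpudec > /dev/null 2>&1 && echo "vpudec available" || echo "vpudec missing"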