i.MX Processors Knowledge Base

Watch the Freescale i.MX team boot Android 5.0 Lollipop on i.MX6 application processors. The Freescale i.MX Android team has booted Android 5.0 Lollipop on the SABRE platform for the i.MX6 series. Google pushed the latest source for its Android release to AOSP on Nov. 5, and the Freescale Android team started work right away. After six days of bring-up, the team had the basic features enabled on day 7: connectivity, audio/video playback, sensors, input and display. You can see some of the changes in the demo video at the beginning of the post. The Freescale i.MX Android team has closely followed almost every Android version since its release to AOSP and has deep experience with the platform. Below are some snapshots and pictures of Android Lollipop.
View full article
Notes:
+ Run the pipelines in the presented order.
+ The examples below stream H.263 video.
+ The 'gl' command is shorthand for 'gst-launch' (two characters instead of ten).
+ Pending work: H.264 test cases and other scenarios.

Shell variables and pipelines

# Always export these variables on the i.MX
export VSALPHA=1
export WIDTH=320
export HEIGHT=240
export SEP=20

Uni-directional, from PC to i.MX: the PC streams four H.263 streams; the i.MX decodes and displays all of them on the screen.

# On i.MX (Target)
gl udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8890 ! rtph263depay ! vpudec ! mfw_isink sync=false axis-top=0 axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT &
gl udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8891 ! rtph263depay ! vpudec ! mfw_isink sync=false axis-top=0 axis-left=`expr $WIDTH + $SEP` disp-width=$WIDTH disp-height=$HEIGHT &
gl udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8892 ! rtph263depay ! vpudec ! mfw_isink sync=false axis-top=`expr $HEIGHT + $SEP` axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT &
gl udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8893 ! rtph263depay ! vpudec ! mfw_isink sync=false axis-top=`expr $HEIGHT + $SEP` axis-left=`expr $WIDTH + $SEP` disp-width=$WIDTH disp-height=$HEIGHT &

# On PC (Source)
export IP_iMX= # Place the IP address of the i.MX board
gst-launch -v videotestsrc ! ffenc_h263 ! rtph263pay ! multiudpsink clients=$IP_iMX:8890,$IP_iMX:8891,$IP_iMX:8892,$IP_iMX:8893

Uni-directional, from PC to i.MX: the PC streams one H.264 stream; the i.MX displays it on the screen.

# On i.MX (Target)
# Make sure you set the caps correctly, especially the sprop-parameter-sets cap. The one shown
# below is just an example and works with the source file sintel_trailer-1080p.mp4.
export VSALPHA=1
GST_DEBUG=*:2 gst-launch -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z2QAMqw05gHgCJ+WEAAAAwAQAAADAwDxgxmg\\,aOl4TLIs\", payload=(int)96' port=8890 ! rtph264depay ! vpudec ! mfw_isink sync=false

# On PC (Source); replace 10.112.102.168 with the IP address of your i.MX board
gst-launch -v filesrc location=sintel_trailer-1080p.mp4 typefind=true ! qtdemux ! rtph264pay ! multiudpsink clients=10.112.102.168:8890

Bi-directional: the PC streams four H.263 streams to the i.MX; the i.MX displays them and sends the four streams back to the PC.

# On i.MX
export IP_PC= # Place the IP address of the PC host machine
gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8890 ! rtph263depay ! vpudec ! tee name=t ! queue ! mfw_isink sync=false axis-top=0 axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT t. ! queue ! vpuenc codec=5 ! rtph263pay ! udpsink host=$IP_PC port=9990 &
gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8891 ! rtph263depay ! vpudec ! tee name=t ! queue ! mfw_isink sync=false axis-top=0 axis-left=`expr $WIDTH + $SEP` disp-width=$WIDTH disp-height=$HEIGHT t. ! queue ! vpuenc codec=5 ! rtph263pay ! udpsink host=$IP_PC port=9991 &
gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8892 ! rtph263depay ! vpudec ! tee name=t ! queue ! mfw_isink sync=false axis-top=`expr $HEIGHT + $SEP` axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT t. ! queue ! vpuenc codec=5 ! rtph263pay ! udpsink host=$IP_PC port=9992 &
gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=8893 ! rtph263depay ! vpudec ! tee name=t ! queue ! mfw_isink sync=false axis-top=`expr $HEIGHT + $SEP` axis-left=`expr $WIDTH + $SEP` disp-width=$WIDTH disp-height=$HEIGHT t. ! queue ! vpuenc codec=5 ! rtph263pay ! udpsink host=$IP_PC port=9993 &

# On PC
## Streams received from the i.MX
export IP_iMX= # Place the IP address of the i.MX board
gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=9990 ! rtph263depay ! ffdec_h263 ! xvimagesink &
gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=9991 ! rtph263depay ! ffdec_h263 ! xvimagesink &
gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=9992 ! rtph263depay ! ffdec_h263 ! xvimagesink &
gl -v udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=9993 ! rtph263depay ! ffdec_h263 ! xvimagesink &

## Streams sent to the i.MX
gl -v videotestsrc ! videoscale ! video/x-raw-yuv,width=\(int\)1408,height=\(int\)1152 ! ffenc_h263 ! rtph263pay ! udpsink host=$IP_iMX port=8890 &
gl -v videotestsrc ! videoscale ! video/x-raw-yuv,width=\(int\)1408,height=\(int\)1152 ! ffenc_h263 ! rtph263pay ! udpsink host=$IP_iMX port=8891 &
gl -v videotestsrc ! videoscale ! video/x-raw-yuv,width=\(int\)1408,height=\(int\)1152 ! ffenc_h263 ! rtph263pay ! udpsink host=$IP_iMX port=8892 &
gl -v videotestsrc ! videoscale ! video/x-raw-yuv,width=\(int\)1408,height=\(int\)1152 ! ffenc_h263 ! rtph263pay ! udpsink host=$IP_iMX port=8893 &
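Since the four receiver pipelines differ only in port number and window position, they can also be generated in a shell loop. This is a minimal sketch using the same elements and variables as above (not part of the original article; the position arithmetic assumes a 2x2 grid):

for i in 0 1 2 3; do
  # port 8890..8893; row = i/2, column = i%2
  gl udpsrc caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H263' port=$((8890 + i)) ! rtph263depay ! vpudec ! mfw_isink sync=false axis-top=$(( (i / 2) * (HEIGHT + SEP) )) axis-left=$(( (i % 2) * (WIDTH + SEP) )) disp-width=$WIDTH disp-height=$HEIGHT &
done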
View full article
Qt framework

Qt is a complete cross-platform development framework with tools designed to streamline the creation of stunning native applications and amazing user interfaces for desktop, embedded and mobile platforms. Qt's cross-platform framework and tools enable developers to target various desktop, embedded, mobile and real-time operating systems with one code base. Qt brings freedom to the developer, saving development time, adding efficiency and ultimately shortening time to market.

Building Qt
Compile Qt for i.MX28
Building QT5 for i.MX53
Building QT for i.MX6
Qt on iMX6

Installing tools
Installing and Configuring QT Creator (Ubuntu)
Qt5 with Qt3D over Wayland rootfs

Demos
Qt5 Cinematic Experience Demo on i.MX6
Video - IMx 53 Qt5 qt3d demo
Qt5 with Qt3D over Wayland rootfs

Information
Qt5 on i.MX6 DO's and DONT's
Best Practices for QML
View full article
Notes:
+ First run the playback pipeline, then the streaming pipeline.
+ The example below streams H.263 video and AMR audio data; change the codec formats to your needs.
+ When the i.MX is the streaming machine, the audio encoder 'amrnbenc' must be installed first. This scenario has not been tested.

Shell variables and pipelines

Playback machine (receiver)

# On the playback machine, set either the IMX2PC or the PC2IMX variables, then run the pipeline.

## IMX2PC: the PC does the playback
AUDIO_DEC_SINK="rtpamrdepay ! amrnbdec ! alsasink "
VIDEO_CAPS="\"application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H263-1998\""
VIDEO_DEC_SINK="rtph263pdepay ! ffdec_h263 ! autovideosink"
## End of IMX2PC settings

## PC2IMX: the i.MX does the playback
AUDIO_DEC_SINK="rtpamrdepay ! mfw_amrdecoder ! alsasink "
VIDEO_CAPS="\"application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H263-1998\""
VIDEO_DEC_SINK="rtph263pdepay ! vpudec ! mfw_v4lsink "
## End of PC2IMX settings

PLAYBACK_AUDIO="udpsrc caps=\"application/x-rtp,media=(string)audio,clock-rate=(int)8000,encoding-name=(string)AMR,encoding-params=(string)1,octet-align=(string)1\" \
        port=5002 ! rtpbin.recv_rtp_sink_1 \
        rtpbin. ! $AUDIO_DEC_SINK \
        udpsrc port=5003 ! rtpbin.recv_rtcp_sink_1 \
        rtpbin.send_rtcp_src_1 ! udpsink port=5007 sync=false async=false"

PLAYBACK_VIDEO="udpsrc caps=$VIDEO_CAPS port=5000 ! rtpbin.recv_rtp_sink_0 \
        rtpbin. ! $VIDEO_DEC_SINK \
        udpsrc port=5001 ! rtpbin.recv_rtcp_sink_0 \
        rtpbin.send_rtcp_src_0 ! udpsink port=5005 sync=false async=false"

PLAYBACK_AV="$PLAYBACK_VIDEO $PLAYBACK_AUDIO"

# Playback pipeline
gst-launch -v gstrtpbin name=rtpbin $PLAYBACK_AV

Streaming machine (sender)

# On the streaming machine, set either the IMX2PC or the PC2IMX variables, then run the pipeline.

## IMX2PC: the i.MX does the streaming
IP=x.x.x.x # IP address of the playback machine
VIDEO_SRC="mfw_v4lsrc"
VIDEO_ENC="vpuenc codec=h263 ! rtph263ppay "
AUDIO_ENC="audiotestsrc ! amrnbenc ! rtpamrpay "
## End of IMX2PC settings

## PC2IMX: the PC does the streaming
IP=y.y.y.y # IP address of the playback machine
VIDEO_SRC="v4l2src"
VIDEO_ENC="ffenc_h263 ! rtph263ppay "
AUDIO_ENC="audiotestsrc ! amrnbenc ! rtpamrpay "
## End of PC2IMX settings

STREAM_AUDIO="$AUDIO_ENC ! rtpbin.send_rtp_sink_1 \
        rtpbin.send_rtp_src_1 ! udpsink host=$IP port=5002 \
        rtpbin.send_rtcp_src_1 ! udpsink host=$IP port=5003 sync=false async=false \
        udpsrc port=5007 ! rtpbin.recv_rtcp_sink_1"

STREAM_VIDEO="$VIDEO_SRC ! $VIDEO_ENC ! rtpbin.send_rtp_sink_0 \
        rtpbin.send_rtp_src_0 ! queue ! udpsink host=$IP port=5000 \
        rtpbin.send_rtcp_src_0 ! udpsink host=$IP port=5001 sync=false async=false \
        udpsrc port=5005 ! rtpbin.recv_rtcp_sink_0"

STREAM_AV="$STREAM_VIDEO $STREAM_AUDIO"

# Stream pipeline
gst-launch -v gstrtpbin name=rtpbin $STREAM_AV
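Before running the IMX2PC case it is worth verifying that the AMR encoder is actually present on the i.MX, since a missing plugin is the most common failure here. This check is an addition to the note above, not part of the original article (amrnbenc packaging varies by BSP):

# On the streaming machine: if this prints the element details, the encoder is installed
gst-inspect amrnbenc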
View full article
This is a prototype solution for making a second display show different content on a JB4.2.2 SabreSD board. It uses the Presentation class provided by Android, embedded into the status bar. When the screen is unlocked, the Presentation is shown on the second display. As written, the solution requires one .mp4 video placed in the root of the SD card; of course, you may change it to show anything. The attached files are a layout XML file, a patch and a recorded video. The layout file should be put into the android/frameworks/base/packages/SystemUI/res/layout/ folder. The patch should be applied to frameworks/base.git. The recorded video shows the dual-display demo as a reference.
View full article
Qt Creator can be a very good IDE for developing great Qt applications. The IDE not only helps with syntax highlighting and access to examples and tutorials, but also helps you configure different toolchains, Qt binary versions and target options.

First download the binary installer:

For 32 bits:
$ wget http://releases.qt-project.org/qtcreator/2.6.2/qt-creator-linux-x86-opensource-2.6.2.bin
For 64 bits:
$ wget http://releases.qt-project.org/qtcreator/2.6.2/qt-creator-linux-x86_64-opensource-2.6.2.bin

Execute the binary:
$ ./qt-creator-linux-x86_64-opensource-2.6.2.bin

Follow the installer GUI and choose a location; the default options should be OK. In my case the installation went here:
/home/b35153/qtcreator-2.6.2/bin

Open Qt Creator (in my case from the command line, using "&" to regain control of the terminal):
$ ./qtcreator &

Open Tools -> Options, choose Build & Run on the menu on the left, and select the Compilers tab. Here you can add the toolchain GCC compiler of your choice; it will appear in the "Manual" section.

Now click on the Qt Versions tab. Here you can add the qmake created by your Qt installation, for example the Qt5 installation described in Building QT for i.MX6. It will appear in the Manual section. In my case I have a qmake for the PC and a qmake for i.MX6.

Now click on the Kits tab. Here you can create combinations of compilers and qmake, and also specify where you want the executables to go. In my case I combined the i.MX6 toolchain and the qmake for i.MX6 I had created. I did not set up a device configuration since the sysroot is already shared with my device via NFS, but you can configure it so the files are sent to your device via ssh.

And that's it! Next time you load a project you can choose which kit you want to work on, and it will be compiled just as you need.
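If you prefer to verify the toolchain outside the IDE first, here is a minimal command-line sketch using the cross-compiled qmake. The install path, project name and board address below are examples, not values from the article; use the qmake produced by your own Qt build (e.g. the one from Building QT for i.MX6):

$ export PATH=/usr/local/Qt-imx6/bin:$PATH   # example install prefix
$ qmake hello.pro
$ make
$ # copy the binary to the target, e.g. via the NFS-shared sysroot or scp:
$ scp hello root@192.168.1.10:/home/root/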
View full article
Overview: This document is written for Freescale customers who have the Freescale AC3 release packages (excluded package). (If you do not have the AC3 release package, you can disregard this document.) The Freescale OMX Player in the Android release supports audio track selection when playing files with multiple audio tracks. However, most customers don't use this enhanced API to select the audio track, even when the current track's codec is not supported. To avoid soundless output when only some of the audio tracks can be played, this document provides a method to automatically select an available audio track. The patch in this document is not included in our current release because it does not match our track selection rule (play the first track). If you have any ideas about this issue, feel free to add comments to this document.

Issue description:
Software: R13.4-GA or R13.4.1 Android releases
Hardware: MX6Dual/Quad SabreSD board
Test source: 1.mkv
Test steps:
1. Launch Gallery from the main menu.
2. Play the video. You can watch the video, but there is no sound.

Root cause: The file has two audio tracks, DTS and AC3: track 1 is DTS and track 2 is AC3. The OMX Player chooses the first audio track as the default track, which is DTS. However, the software only supports the AC3 audio codec, so it cannot set up an audio decoder for the DTS track. If we choose to play the AC3 track instead, sound can be heard.

How to fix: The audio track index is set in GMPlayer::LoadParser(). You can check the audio format to see whether it is supported by the decoder. Please see the patch audio_track_slection.diff.
View full article
Test digital zoom with the IPU for camera preview.

Board: Sabre-SD (i.MX6DQ)
BSP: Android 13.4 GA

In the preview flow, one frame buffer is processed in four steps. A step is added before step 4 to change the frame buffer; this added step zooms one preview frame. The figure below shows the crop function of the IPU library; we use this function to scale the frame.

Test result: preview at zoom level 0 and at the maximum zoom level are shown below.

When taking pictures at 5M pixels with a zoom over level 1, the picture size is not 2592x1944 but 2016x1512. The underlying reason is that the IPU crop function only supports a maximum output of 2048x2048.

Thumbnails of the test results:
View full article
(DEPRECATED. Please check this document for Real Time Streaming.) A server can be streaming video while a client, in this case an i.MX6 target, receives and decodes it. For example, a server with GStreamer and a connected web camera can stream with the following command:

$ # Pipeline 1
$ gst-launch v4l2src ! 'video/x-raw-yuv, format=(fourcc)I420, width=(int)1280, height=(int)800' ! ffenc_mpeg4 ! tcpserversink host=$CLIENT_IP port=$PORT

and on the target, the client receives, decodes and displays it with:

$ # Pipeline 2
$ gst-launch tcpclientsrc host=$SERVER_IP port=$PORT ! 'video/mpeg, width=(int)1280, height=(int)800, framerate=(fraction)10/1, mpegversion=(int)4, systemstream=(boolean)false' ! vpudec ! mfw_isink

The filter caps between tcpclientsrc and the decoder (vpudec) depend on the sink caps coming from the server encoder (ffenc_mpeg4), so these may change depending on your needs. Running the above pipelines requires the environment variables SERVER_IP, CLIENT_IP and PORT. In case you want the i.MX6 to act as a server, just change the video source (either mfw_v4lsrc or v4l2src) and the encoder (vpuenc):

$ # Pipeline 3
$ gst-launch v4l2src ! 'video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, interlaced=(boolean)false, framerate=(fraction)10/1' ! vpuenc ! tcpserversink host=$CLIENT_IP port=$PORT

For testing purposes, set SERVER_IP=127.0.0.1, CLIENT_IP=127.0.0.1 and PORT=500, and run pipelines 3 and 2 in two different consoles. Check the CPU usage with 'top' and see that the VPU is actually doing most of the work.
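For the loopback test suggested above, the two consoles on the i.MX6 would look like this. This is a sketch: the client caps of pipeline 2 are adjusted to 640x480 to match the output of pipeline 3 (vpuenc defaults to MPEG-4, matching the video/mpeg caps):

$ # Console 1: server/encoder (pipeline 3)
$ export CLIENT_IP=127.0.0.1 PORT=500
$ gst-launch v4l2src ! 'video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480, interlaced=(boolean)false, framerate=(fraction)10/1' ! vpuenc ! tcpserversink host=$CLIENT_IP port=$PORT

$ # Console 2: client/decoder (pipeline 2, caps matched to pipeline 3)
$ export SERVER_IP=127.0.0.1 PORT=500
$ gst-launch tcpclientsrc host=$SERVER_IP port=$PORT ! 'video/mpeg, width=(int)640, height=(int)480, framerate=(fraction)10/1, mpegversion=(int)4, systemstream=(boolean)false' ! vpudec ! mfw_isink

$ # Console 3: watch CPU usage; the VPU should be doing most of the work
$ top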
View full article
GStreamer has a simple tracing feature that allows the developer to do basic debugging. It can be enabled in two ways:

Adding the parameter --gst-debug=LIST to the pipeline (a pipeline is an executed gst-launch command)
Prepending the environment variable GST_DEBUG=LIST

LIST is a comma-separated argument indicating the GStreamer elements to trace. For example, if one needs to trace the sink element:

     $ GST_DEBUG=*sink*:5 gst-launch playbin2 uri=file:///sample.avi
or
     $ gst-launch playbin2 uri=file:///sample.avi --gst-debug=*sink*:5

Both commands produce the same log. In case you want to trace more than one element, simply add more <element>:5 entries, for example:

     $ GST_DEBUG=mfw_v4lsink:5,vpudec:5 gst-launch playbin2 uri=file:///sample.avi

The number 5 indicates the log level, where 5 is the highest (the most verbose log you can get) and 0 produces no output (5=LOG, 4=DEBUG, 3=INFO, 2=WARN, 1=ERROR). The log can be huge for each pipeline run. One way to filter it is with the grep command. Before grepping, one needs to redirect the standard error to the standard output (the GStreamer log always goes to stderr):

     $ GST_DEBUG=mfw_v4lsink:5,vpudec:5 gst-launch playbin2 uri=file:///sample.avi 2>&1 | grep <filter string>

In case the log needs to be shared, it is important to remove the 'color' of the log; just add the parameter --gst-debug-no-color or prepend the environment variable GST_DEBUG_NO_COLOR=1.

More shell variables that GStreamer reacts to can be found here: https://developer.gnome.org/gstreamer/0.10/gst-running.html
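Putting the pieces above together, a one-step sketch for capturing a shareable, colorless log to a file and then filtering it (the element and file names are examples):

     $ GST_DEBUG=vpudec:5 GST_DEBUG_NO_COLOR=1 gst-launch playbin2 uri=file:///sample.avi 2> vpudec.log
     $ grep -i error vpudec.log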
View full article
There is no Freescale GStreamer element which does JPEG decoding, so we must rely on a standard one, like 'jpegdec'. In case your Linux system was built using LTIB, follow these steps to have the jpegdec element included in gst-plugins-good:

1. In the LTIB menuconfig, make sure the following packages are selected: gstreamer-plugins-good, libjpeg, libpng
2. Remove the configure parameters '--disable-libpng' and '--disable-jpeg' in the file ./dist/lfs-5.1/gst-plugins-good/gst-plugins-good.spec
3. Rebuild and flash your board (or SD card) again.

Image display:

VSALPHA=1 gst-launch filesrc location=sample.jpeg ! jpegdec ! imagefreeze ! mfw_isink

Important: width and height that are not 8-pixel aligned are treated as an unsupported format by the isink plugin.
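jpegdec can also drive a simple slideshow from numbered files via multifilesrc (part of gst-plugins-good). A sketch, assuming files named slide0000.jpeg, slide0001.jpeg, ... and a display rate of one image every two seconds; the filenames and framerate are examples:

VSALPHA=1 gst-launch multifilesrc location=slide%04d.jpeg caps="image/jpeg,framerate=(fraction)1/2" ! jpegdec ! mfw_isink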
View full article
Freescale does not have a specific GStreamer element for JPEG encoding, so the standard 'jpegenc' should be used.

Image capture with a web camera:
gst-launch v4l2src num-buffers=1 ! jpegenc ! filesink location=sample.jpeg

With an embedded camera:
gst-launch mfw_v4lsrc num-buffers=1 ! jpegenc ! filesink location=sample.jpeg

More pipelines on GStreamer i.MX6 Pipelines
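If the camera's default resolution is not what you want in the JPEG, fix it with a caps filter before the encoder. A sketch (the 640x480 size is an example, not from the original article):

gst-launch v4l2src num-buffers=1 ! 'video/x-raw-yuv, format=(fourcc)I420, width=(int)640, height=(int)480' ! jpegenc ! filesink location=sample_vga.jpeg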
View full article
Multiple-Overlay (or Multi-Overlay) means several video playbacks on a single screen. In case multiple screens are needed, check the dual-display case in GStreamer i.MX6 Multi-Display.

$ export VSALPHA=1
$ SAMPLE1=sample1.avi; SAMPLE2=sample2.avi; SAMPLE3=sample3.avi; SAMPLE4=sample4.avi;
$ WIDTH=320; HEIGHT=240; SEP=20

Four displays (2x2):

$ gst-launch \
playbin2 uri=file://`pwd`/$SAMPLE1 video-sink="mfw_isink axis-top=0 axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT" \
playbin2 uri=file://`pwd`/$SAMPLE2 video-sink="mfw_isink axis-top=0 axis-left=`expr $WIDTH + $SEP` disp-width=$WIDTH disp-height=$HEIGHT" \
playbin2 uri=file://`pwd`/$SAMPLE3 video-sink="mfw_isink axis-top=`expr $HEIGHT + $SEP` axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT" \
playbin2 uri=file://`pwd`/$SAMPLE4 video-sink="mfw_isink axis-top=`expr $HEIGHT + $SEP` axis-left=`expr $WIDTH + $SEP` disp-width=$WIDTH disp-height=$HEIGHT"

Basic rotation (2x1, normal and inverted):

$ gst-launch \
playbin2 uri=file://`pwd`/$SAMPLE1 video-sink="mfw_isink axis-top=0 axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT rotation=0" \
playbin2 uri=file://`pwd`/$SAMPLE2 video-sink="mfw_isink axis-top=`expr $HEIGHT + $SEP` axis-left=0 disp-width=$WIDTH disp-height=$HEIGHT rotation=3"
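The 2x2 grid can also be generated in a shell loop instead of being spelled out, one playbin2 process per window. A sketch using the same variables as above (launching four separate gst-launch processes rather than one, which works equally well for this case):

i=0
for s in $SAMPLE1 $SAMPLE2 $SAMPLE3 $SAMPLE4; do
  # row = i/2, column = i%2
  gst-launch playbin2 uri=file://`pwd`/$s video-sink="mfw_isink axis-top=$(( (i / 2) * (HEIGHT + SEP) )) axis-left=$(( (i % 2) * (WIDTH + SEP) )) disp-width=$WIDTH disp-height=$HEIGHT" &
  i=$((i + 1))
done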
View full article
1. Set up HDMI
Set up your kernel to use HDMI by adding the following code to the bootargs in u-boot:
video=mxcfb0:dev=hdmi,1920x1080M@60,if=RGB24

2. Test raw audio
In order to test only raw audio, use the following command:
aplay -D hw:1,0 Kaleidoscope.wav

3. Make HDMI audio the default output
In order to configure audio output over HDMI, replace the content of the file ~/.asoundrc with the following:

pcm.dmix_48000{
     type dmix
     ipc_key 5678293
     ipc_key_add_uid yes
     slave{
          pcm "hw:1,0"
          period_time 0
          period_size 2048
          buffer_size 24576
          format S16_LE
          rate 48000
     }
}
pcm.!dsnoop_44100{
     type dsnoop
     ipc_key 5778293
     ipc_key_add_uid yes
     slave{
          pcm "hw:0,0"
          period_time 0
          period_size 2048
          buffer_size 24576
          format S16_LE
          rate 44100
     }
}
pcm.!dsnoop_48000{
     type dsnoop
     ipc_key 5778293
     ipc_key_add_uid yes
     slave{
          pcm "hw:1,0"
          period_time 0
          period_size 2048
          buffer_size 24576
          format S16_LE
          rate 48000
     }
}
pcm.asymed{
     type asym
     playback.pcm "dmix_48000"
     capture.pcm "dsnoop_44100"
}
pcm.dsp0{
     type plug
     slave.pcm "asymed"
}
pcm.!default{
     type plug
     route_policy "average"
     slave.pcm "asymed"
}
ctl.mixer0{
     type hw
     card 0
}

This configures ALSA to use sound card hw:1,0. Please pay attention to using the proper audio card name for your device. In order to see the available sound cards on the board:

root@imx53qsb:~# aplay -l
**** List of PLAYBACK Hardware Devices ****
card 0: imx3stack [imx-3stack], device 0: SGTL5000 SGTL5000-0 []
  Subdevices: 1/1
  Subdevice #0: subdevice #0
card 1: imx3stackspdif [imx-3stack-spdif], device 0: IMX SPDIF mxc spdif-0 []
  Subdevices: 1/1
  Subdevice #0: subdevice #0

For details on how to create asound.conf, please see the alsa-lib configuration introduction.

4. Encoded audio
For encoded (i.e. AC3, DTS) audio, you can use, for example, ac3dec, a utility provided by alsa-tools, with the following command line:
ac3dec -D hw:1,0 -C test.ac3
This works for both HDMI audio and SPDIF audio. Double-check your hardware and/or schematic in order to know which one to use.
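Once ~/.asoundrc is in place, playback through the default device should come out over HDMI. Two quick checks, added here as a suggestion (speaker-test is part of alsa-utils):

aplay Kaleidoscope.wav        # no -D needed: pcm.!default routes via dmix_48000 to hw:1,0
speaker-test -c 2 -r 48000 -t wav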
View full article
What is HTML5 video? HTML5 video is an element defined in the HTML5 specification for playing videos or movies. It is intended by its creators to become the new standard way to show video on the web without plugins: the video is shown inside the web page, like Flash.

HTML5 video web page, <video> element example:

<video src="movie.mp4" poster="movie.jpg" controls>
</video>

HTML5 video page source example:

<html>
     <head>
     </head>
     <body>
          <video src="http://10.192.225.226/movie.mp4" width="640" height="480" controls="true">
          </video>
     </body>
</html>

HTML5 video rendering path performance data on i.MX6Q with Android ICS:
With an LVDS display, H264@1080p@20Mbps can reach 30 fps.
With an HDMI 1080p display, H264@1080p@10Mbps can reach 25 fps.

Some HTML5 video websites to try when accessing with an Android platform:
www.youtube.com
www.iqiyi.com

HTML5 reference documents:
SPEC: http://dev.w3.org/html5/spec/single-page.html?utm_source=dlvr.it&utm_medium=feed
Wikipedia page: http://en.wikipedia.org/wiki/HTML5
View full article
If you want to use a USB camera (these types of cameras are also called 'web cameras') with GStreamer on i.MX6 devices (Linux kernel version >= 3.0.35), you need to either load the module dynamically or compile and link it statically by selecting (Y) the following config in the kernel configuration:

     Device Drivers -> Multimedia support -> Video capture adapters -> V4L USB devices -> <*> USB Video Class (UVC)

After the kernel image has been built, flash it onto the target, plug in the web cam, then on a (target) terminal run:

     gst-launch v4l2src ! mfw_v4lsink

You should see what the camera is capturing on the display. In case you need to encode the camera src data, place an encoder in the pipeline:

     gst-launch v4l2src num-buffers=100 ! queue ! vpuenc codec=0 ! matroskamux ! filesink location=output.mkv sync=false

We are using a specific codec here (codec=0 means MPEG-4); check the options using 'gst-inspect vpuenc'.
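If the board has more than one capture device (for example a CSI camera in addition to the web cam), list the video nodes and pick the UVC one explicitly via the device property; /dev/video1 below is just an example:

     ls /dev/video*
     gst-launch v4l2src device=/dev/video1 ! mfw_v4lsink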
View full article
In some cases it is desired to directly have progressive content available from a TV-IN interface through the V4L2 capture device. In the BSP, HW-accelerated de-interlacing is only supported in the V4L2 output stream. Below is a patch created against a rather old BSP version that adds support for de-interlaced V4L2 capture. The patch might need to be adapted to newer BSPs; however, the logic and functionality are there and should shorten the development time. This patch adds another input device to the V4L2 framework that can be selected to perform the de-interlacing on the way to memory. The selection is done by passing the index "2" as an argument to the VIDIOC_S_INPUT V4L2 ioctl. Attached is also a modified tvin unit test that gives an example of how to use the new driver. An example sequence for running the test is as follows:

modprobe mxc_v4l2_capture
./mxc_v4l2_tvin_vdi.out -ow 720 -oh 480 -ol 10 -ot 20 -f YU12

Some key things to note:
This driver does not support resize or color space conversion on the way to memory. The requested format and size should match what can be provided directly by the sensor.
The driver was tested on a Sabre AI Rev A board running Linux 12.02.
This code is not an official delivery and as such no guarantee of support for this code is provided by Freescale.
View full article
A description of VPU and IPU usage in the Android R13.4 GA release for the i.MX6DQ.
View full article
In this article, some experiments are done to verify the capability of the i.MX6DQ for video playback under different VPU clocks.

1. Preparation
Board: i.MX6DQ SD
Bitstream: 1080p sunflower at 40Mbps, considered one of the toughest H264 clips. The original clip is copied 20 times to generate a new raw video (the sunflower clip repeated 20 times) and then encapsulated into an mp4 container. This removes and minimizes the influence of gstreamer's startup workload compared to the vpu unit test.
Kernels: Kernels are generated with different VPU clock settings: 270MHz, 298MHz, 329MHz, 352MHz, 382MHz.
Test setting: 1080p content decoded and displayed on a 1080p device (no resize).

2. Test commands for the VPU unit test and Gstreamer
Tiled-format video playback is faster than NV12 format, so in the experiments below we choose the tiled format.

Unit test command (we set the frame rate with -a 70, higher than the 1080p 60fps HDMI refresh rate):
/unit_tests/mxc_vpu_test.out -D "-i /media/65a78bbd-1608-4d49-bca8-4e009cafac5e/sunflower_2B_2ref_WP_40Mbps.264 -f 2 -y 1 -a 70"

Gstreamer command (free run to get the highest playback speed):
gst-launch filesrc location=/media/65a78bbd-1608-4d49-bca8-4e009cafac5e/sunflower_2B_2ref_WP_40Mbps.mp4 typefind=true ! aiurdemux ! vpudec framedrop=false ! queue max-size-buffers=3 ! mfw_v4lsink sync=false

3. Video playback framerate measurement
During the test, we run "echo performance > /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor" to make sure the CPU always works at the highest frequency, so that it can respond to any interrupt quickly. For each test point with a different VPU clock, we do 5 rounds of tests. The max and min values are removed, and the remaining 3 data points are averaged to get the final playback framerate.

Results (Dec = VPU decoding fps, Play = overall playback fps; Min/Max/Avg are over the Play values):

270MHz unit test: Dec 57.8 / 57.81 / 57.78 / 57.87 / 57.91; Play 57.3 / 57.04 / 57.3 / 56.15 / 55.4; Min 55.4, Max 57.3, Avg 56.83
270MHz GST:       Play 53.76 / 54.163 / 54.136 / 54.273 / 53.659; Min 53.659, Max 54.273, Avg 54.020
298MHz unit test: Dec 60.97 / 60.98 / 60.97 / 60.94 / 60.98; Play 58.37 / 58.55 / 57.8 / 58.07 / 58.65; Min 57.8, Max 58.65, Avg 58.33
298MHz GST:       Play 56.755 / 49.144 / 53.271 / 56.159 / 56.665; Min 49.144, Max 56.755, Avg 55.365
329MHz unit test: Dec 63.8 / 63.92 / 63.8 / 63.82 / 63.78; Play 59.52 / 52.63 / 58.1 / 58.26 / 59.34; Min 52.63, Max 59.52, Avg 58.567
329MHz GST:       Play 57.815 / 55.857 / 56.862 / 58.637 / 56.703; Min 55.857, Max 58.637, Avg 57.127
352MHz unit test: Dec 65.79 / 65.78 / 65.78 / 66.16 / 65.93; Play 59.63 / 59.68 / 59.65 / 49.21 / 57.67; Min 49.21, Max 59.68, Avg 58.983
352MHz GST:       Play 58.668 / 59.103 / 56.419 / 58.08 / 58.312; Min 56.419, Max 59.103, Avg 58.353
382MHz unit test: Dec 64.34 / 67.8 / 67.75 / 67.81 / 67.77; Play 56.58 / 58.73 / 59.68 / 59.36 / 59.76; Min 56.58, Max 59.76, Avg 59.257
382MHz GST:       Play 59.753 / 58.893 / 58.972 / 58.273 / 59.238; Min 58.273, Max 59.753, Avg 59.034

Some explanation: why does the Gstreamer performance keep improving while the unit test flattens out? On Gstreamer there is a vpu wrapper which makes the vpu API more intuitive to call. So at first, the overall GST playback performance is constrained by the VPU (vpu dec 57.8 fps). Eventually, as VPU decoding performance goes above 60fps with increasing VPU clock, the constraint becomes the display refresh rate of 60fps. The video display overhead of Gstreamer is only about 1 fps, similar to the unit test.

Based on the test results, we can see that at 352MHz, overall 1080p video playback on a 1080p display can reach ~60fps. Alternatively, if time-shared by two pipelines with two displays, we can do 2 x 1080p @ 30fps video playback.
Note, however, that this experiment covers 1080p video playback on a 1080p display. For interlaced clips, or displays whose size differs from 1080p, overall playback performance is further limited by post-processing such as de-interlacing and resizing.
View full article
When streaming, if you want to play a streaming URL, it can be inconvenient if the browser does not recognize the URL as a media stream and downloads the content rather than using Gallery to play it. To create this kind of media streaming, you need to write an apk that uses VideoView to play the URL/media stream from the console. Here is how to play a media file or network stream from the console.

Gingerbread:
am start -n com.cooliris.media/com.cooliris.media.MovieView -d "<URL>"

The URL can be a file location or a network stream URL. For example, you can play a local file with:
am start -n com.cooliris.media/com.cooliris.media.MovieView -d "/mnt/sdcard/test.mp4"
You can also play an http stream:
am start -n com.cooliris.media/com.cooliris.media.MovieView -d "http://v.iask.com/v_play_ipad.php?vid=76710932"
Or play an rtsp stream:
am start -n com.cooliris.media/com.cooliris.media.MovieView -d "rtsp://10.0.2.1:554/stream"

ICS:
am start -n com.android.gallery3d/com.android.gallery3d.app.MovieActivity -d "<URL>"
The URL has the same definition as in Gingerbread.
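When working from a host PC rather than the device console, the same commands can be issued through adb. A sketch using the ICS RTSP case above:

adb shell am start -n com.android.gallery3d/com.android.gallery3d.app.MovieActivity -d "rtsp://10.0.2.1:554/stream"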
View full article