i.MX6 JPEG hardware decoder in Android

phopkins
Contributor III

I need i.MX6 hardware support for JPEG decoding in the land of Android.  I see the i.MX6 Android 4.3 AOSP release has the following video codec support in android.media.MediaCodec: "video/mp4v-es", "video/3gpp", "video/avc", and "video/x-vnd.on2.vp8".  However, I do not see support for decoding JPEG.  How would one do JPEG decoding from Java code on the i.MX6?  Do I need to make an NDK lib and refer to the JPEG decoding in OpenMAX somehow?  If so, is there an example of how to do this with the i.MX6 JPEG lib?

Project details:

--We are producing an MJPEG stream in real time at 30 FPS.  I have a version of it working with BitmapFactory.decodeByteArray, but that decoding is done in software and is too slow.

--I'm using an i.MX6 Wandboard Quad with their official Android 4.3 release.  I also have the official Freescale 4.3 release compiled and working on our own hardware, in case I need to change anything in the source.
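To be concrete about the NDK-lib idea, this is the kind of JNI bridge I have in mind (all names here are hypothetical; what to put inside it is exactly my question):

    #include <jni.h>

    /*
     * Hypothetical JNI entry point: a Java class com.example.MjpegDecoder with
     *   private static native int nativeDecode(byte[] jpeg, byte[] outPixels);
     * would land here.  Whether the body should drive OpenMAX, vpu_wrapper,
     * or libvpu is the open question.
     */
    JNIEXPORT jint JNICALL
    Java_com_example_MjpegDecoder_nativeDecode(JNIEnv *env, jclass clazz,
                                               jbyteArray jpeg, jbyteArray outPixels)
    {
        jsize inLen = (*env)->GetArrayLength(env, jpeg);
        jbyte *in   = (*env)->GetByteArrayElements(env, jpeg, NULL);
        jbyte *out  = (*env)->GetByteArrayElements(env, outPixels, NULL);
        (void)clazz;

        int ret = -1;  /* TODO: hand 'in' (inLen bytes) to a hardware decoder
                          and write decoded pixels into 'out' */
        (void)inLen;

        (*env)->ReleaseByteArrayElements(env, jpeg, in, JNI_ABORT);
        (*env)->ReleaseByteArrayElements(env, outPixels, out, 0);
        return ret;
    }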

9 Replies

heyutong
Contributor I

Hey dude, did you ever solve this problem?


phopkins
Contributor III

Here's a quick rundown of Freescale's multimedia stack.  I will show which classes talk to each other and how buffers get from a high-level Java app down to the VPU registers.  Please feel free to make corrections, as this is a learning process for me.

This is a flow chart that follows a buffer chain from a Java app using VideoView.  NOTE:  I have not identified all the layers in Java, just where the buffer goes after it leaves Java.  The list goes from top to bottom, high level to low level in the stack; each layer is instantiated by the layer above it.

VideoView.java

(... Android stuff here ...)

mediaserver, a daemon running in Android (started by zygote), calls: MediaPlayerFactory::createPlayer(playerType, this, notify); (/frameworks/av/media/libmediaplayerservice/MediaPlayerService.cpp)

MediaPlayerFactory, calls: return new OMXPlayer(); (/frameworks/av/media/libmediaplayerservice/MediaPlayerFactory.cpp)

OMXPlayer, calls: gm = OMX_GraphManagerCreate(); (/external/fsl_imx_omx/Android/OMXPlayer.cpp)

GMPlayerWrapper, calls: Player = FSL_NEW(GMPlayer, ()); (/external/fsl_imx_omx/OpenMAXIL/src/client/GMPlayerWrapper.cpp)

GMPlayer, calls: ret = LoadComponent(role, &Parser); (/external/fsl_imx_omx/OpenMAXIL/src/client/GMPlayer.cpp)

NOTE:  This is where it gets tricky.  The 'role' is a string with the value "video_decoder.mjpeg".  It maps to the module VpuDecComponent, instantiated via VpuDecoderInit, per /external/fsl_imx_omx/OpenMAXIL/release/registry/component_register.  VpuDecComponent's base class is ComponentBase, which is what GMPlayer accepts.  We will jump into VpuDecComponent from here; a paraphrased register entry is shown below.
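For reference, an entry in that register file looks roughly like this (paraphrased from memory, not verbatim; check your own component_register):

    component_name=OMX.Freescale.std.video_decoder.mjpeg.hw-based;
        component_role=video_decoder.mjpeg;
        library_path=lib_omx_vpu_dec_v2_arm11_elinux.so;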

VpuDecComponent includes "vpu_wrapper.h".  This is where we leave C++ and start calling compiled C libraries.  The vpu_wrapper lib is lib_vpu_wrapper; functions like VPU_DecDecodeBuf are called directly from vpu_wrapper.

vpu_wrapper includes "vpu_lib.h".  Another library of importance: libvpu.  vpu_wrapper makes many calls into this lib, to functions like vpu_DecStartOneFrame.

vpu_lib: includes: "vpu_util.h".  vpu_util is part of the same vpulib.  vpu_lib call stuff like: JpegDecodeHeader.

vpu_lib: includes "vpu_io.h".  vpu_io is also part of vpulib.  vpu_lib calls stuff like: VpuWriteReg and VpuReadReg.

That is it for the stack.  There is much more to each layer than I can describe here, but with this list you can see how the hardware gets instantiated and how the JPEG makes its way down to it.  Many of these layers are contained within their own library, a library you can reference from the NDK.  My plan is to call libvpu directly from the NDK, as I already have the stream parsed and a JPEG in a buffer, ready for decoding.  Once this is done, I need to figure out how to call the resizer to get my correct colorspace.  A sketch of the call sequence I expect follows.
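To make the plan concrete, here is the rough call sequence I expect to need against vpu_lib.h, based on reading the sources.  Treat it as a sketch: the bitstream buffer allocation, vpu_DecGetInitialInfo, and frame buffer registration are elided, and those are exactly the fiddly parts.

    #include "vpu_lib.h"   /* from Freescale's libvpu */

    /* Sketch only: error handling and buffer setup omitted. */
    int decode_one_jpeg(void)
    {
        DecHandle handle;
        DecOpenParam oparam = {0};
        DecParam dparam = {0};
        DecOutputInfo oinfo = {0};

        vpu_Init(NULL);                     /* bring up the VPU */

        oparam.bitstreamFormat = STD_MJPG;  /* JPEG and MJPEG share this codec id */
        /* oparam.bitstreamBuffer/bitstreamBufferSize must point at
           physically contiguous memory from the VPU allocator (elided) */
        vpu_DecOpen(&handle, &oparam);

        /* ...copy the JPEG into the bitstream buffer, then
           vpu_DecGetInitialInfo() and vpu_DecRegisterFrameBuffer()... */

        vpu_DecStartOneFrame(handle, &dparam);
        while (vpu_IsBusy())                /* or block on the VPU interrupt */
            ;
        vpu_DecGetOutputInfo(handle, &oinfo);
        /* oinfo.indexFrameDisplay names the YUV frame to resize/render */

        vpu_DecClose(handle);
        vpu_UnInit();
        return 0;
    }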

peter_pinewski
NXP Employee

Phil,

I ran across this doc, Multimedia with VPU & IPU HW Acceleration in Android, which may help in understanding the encoding.  I don't know if it helps at all.

Also, I found this:

To use a hardware codec, you need to load an OMX component and use it according to OpenMAX Integration Layer v1.1.2.  There is sample code in:
    external/fsl_imx_omx/OpenMAXIL/test/vpu_enc_test/
    external/fsl_imx_omx/OpenMAXIL/test/vpu_test/
All available components are listed in:
    external/fsl_imx_omx/OpenMAXIL/release/registry/component_register
All video encoder components are hardware encoders.
All video decoder components with the library path set to "lib_omx_vpu_dec_v2_arm11_elinux.so" are hardware decoders.  A minimal IL client sequence is sketched below.
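For what it's worth, the IL client sequence from the spec boils down to something like this; the component name string here is my guess, so take the real one from component_register:

    #include "OMX_Core.h"

    /* Sketch of an OpenMAX IL v1.1.2 client; callbacks, port setup,
       buffer negotiation, and error checks are omitted. */
    static OMX_CALLBACKTYPE callbacks = {
        NULL, NULL, NULL   /* EventHandler, EmptyBufferDone, FillBufferDone */
    };

    int run_decoder(void)
    {
        OMX_HANDLETYPE hDec = NULL;

        OMX_Init();
        /* Component name is an assumption; read yours from component_register */
        OMX_GetHandle(&hDec, "OMX.Freescale.std.video_decoder.mjpeg.hw-based",
                      NULL, &callbacks);

        /* Configure ports, allocate buffers with OMX_AllocateBuffer(), then: */
        OMX_SendCommand(hDec, OMX_CommandStateSet, OMX_StateIdle, NULL);
        OMX_SendCommand(hDec, OMX_CommandStateSet, OMX_StateExecuting, NULL);

        /* Feed compressed frames with OMX_EmptyThisBuffer() and collect
           decoded YUV with OMX_FillThisBuffer() */

        OMX_FreeHandle(hDec);
        OMX_Deinit();
        return 0;
    }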

Another way is to use the lower-level vpu_wrapper interface defined in vpu_wrapper.h; the library name is lib_vpu_wrapper.so.  But it is not based on an existing specification and lacks documentation.  Please refer to external/fsl_imx_omx/OpenMAXIL/src/component/vpu_dec_v2/VpuDecComponent.cpp for how to use this library.

Please let me know if this helps.

Regards,

Peter

phopkins
Contributor III

Peter--

Another possibility is to use an H.264 stream.  However, we require the following:

-- video stream is 480p (720x480)

-- less than 100 ms of delay in the video, i.e. basically real-time rendering of a video stream

-- we need to display two streams at the same time, so two instances of an H.264 decoder

We successfully used RTSP as a wrapper around an H.264-encoded stream; however, the delay was over 5 seconds.  If we can meet the above requirements, I have no issue with using H.264 either.  We went with MJPEG because I know I can decode two streams at once, and I know we can decode and render each frame in less than 100 ms.  Again, I have control of the video source, so I can use whatever wrapper around H.264 gives us the lowest latency.

Can you let me know your thoughts on this?


phopkins
Contributor III

Peter--

This PowerPoint gives a good overview of how the Freescale video layers work in the Linux layers of Android.  Some important points I gathered from it:

--The OMX layer creates a native window for rendering in Android.

--Perhaps framebuffers can stay native and reach the LVDS/HDMI output without being rendered to a SurfaceView in Android.

These points are very important to understand: once you get a decoded frame, you need to render it somehow, without several expensive buffer copies.

I've tried using the lower layers of the user-space VPU drivers, unsuccessfully, as there are sequences and states I have not yet completely figured out; this was when I was trying to call libvpu.so and lib_vpu_wrapper.so directly (the sequence I attempted is sketched below).  The next layer up is VpuDecComponent, which seems like the next best choice, as it handles the states of vpu_wrapper and libvpu.  I'm currently trying to figure out how to set up the buffer pool for it.  Since VpuDecComponent has several base classes, VideoFilter and ComponentBase, it is somewhat of a challenge to reverse engineer its creation and how its buffers are configured and handled.
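For anyone following along, the sequence I was attempting against vpu_wrapper.h looked roughly like this.  The signatures and the VPU_V_MJPG enum are from my reading of the header, so treat them as assumptions; the state flags coming back in bufRetCode are the part VpuDecComponent handles for you, and the part I kept getting wrong:

    #include "vpu_wrapper.h"

    /* Sketch: VPU_DecQueryMem()/memory allocation and output frame
       buffer registration are elided. */
    int wrapper_decode(unsigned char *jpeg, int len)
    {
        VpuDecHandle handle;
        VpuDecOpenParam oparam = {0};
        VpuMemInfo meminfo = {0};
        VpuBufferNode in = {0};
        int bufRetCode = 0;

        VPU_DecLoad();
        /* VPU_DecQueryMem(&meminfo) plus the actual allocation go here */
        oparam.CodecFormat = VPU_V_MJPG;    /* enum name is my assumption */
        VPU_DecOpen(&handle, &oparam, &meminfo);

        in.pVirAddr = jpeg;
        in.nSize = len;
        VPU_DecDecodeBuf(handle, &in, &bufRetCode);
        /* bufRetCode flags report init-info-ready, input-consumed, and
           output-frame-ready; VpuDecComponent loops on these states */

        VPU_DecClose(handle);
        VPU_DecUnLoad();
        return 0;
    }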

I would like to use the OMX layer; however, it wants a media stream of some kind, and I do not know how to format my MJPEG stream so that OMXPlayer recognizes it as MJPEG.  I've tried various gstreamer muxing elements to solve this.  I know OMXPlayer will recognize an AVI wrapper, but that is a file of known length and frame count.  Perhaps this is an inherent problem with MJPEG streams: there is no real industry standard for an MJPEG stream mux or wrapper.

Please let me know if you have any insight into how to configure OMXPlayer to accept an MJPEG stream.  I have access to almost any type of muxing in gstreamer for our MJPEG stream, since I control the creation of the stream on another piece of hardware.

Thanks!


phopkins
Contributor III

I'm going to keep people up to date here, posting my thoughts and findings, as these questions have not really been answered well by Freescale.  I do not have an answer yet; however, I'm getting closer.  The following links are similar questions that have not been answered:

How to use MJPEG hardware decoder in Android with IMX6Q

MJPEG format in UvcDevice.cpp(i.MX6 Android camera HAL)

What I have found is a good amount of code and libraries that Freescale has placed within their Android source.  If you download their Android Open Source Project package, you will find "external" at the root directory of the project.  This seems to be the dumping ground for proprietary and device-specific code, libraries, and executables.  I've been focusing on the fsl_imx_omx directory, as the majority of Freescale's video code is in there.

In one of the threads mentioned above, Daiane Angolini shared some useful information about hardware acceleration working in Gallery.  I was able to get a JPEG-encoded AVI to play in Gallery, hardware accelerated.  Then I made an application using VideoView on a SurfaceView to play TWO 480p MJPEG AVIs at 60 FPS at once.  The process called "mediaserver" handled the work: it used about 10% of the CPU, while my application used 0%.  This proves that Freescale has code somewhere that can hardware-decode JPEGs.


Upon further investigation, I found that "OMXPlayer" was being used, possibly by the mediaserver process.  However, there is no OMXPlayer process, so it is a lib that is being loaded; you can find it in 'external/fsl_imx_omx/Android/'.  If you continue to study the 'external/fsl_imx_omx' folder and all its source code, you will see that Freescale has built various libraries that are installed and exposed to the NDK.  They have done a good job making many levels of libs you can access, so you can control media at any point in the processing chain.  This is where I'm at: I need to figure out how to create a JPEG decoder, correctly initialize it, feed it frames, and convert the output to something that can be displayed on the screen.  I've been finding it useful to look at each Android.mk file, since it lists all the libraries the target module references; with that, you can research each of those libraries.

Code that ran two JPEG-encoded, hardware-decoded AVIs:

public void ConnectVideoOnly()
{
    setContentView(R.layout.rtsp_view);

    // First stream
    VideoView videoView = (VideoView) this.findViewById(R.id.videoView1);
    // Add controls to the MediaPlayer (play, pause, ...)
    MediaController mc = new MediaController(this);
    videoView.setMediaController(mc);
    String path = Environment.getExternalStorageDirectory() + "/DCIM/mjpeg.avi";
    videoView.setVideoPath(path);
    // Set the focus
    videoView.requestFocus();

    // Second stream
    VideoView vv2 = (VideoView) this.findViewById(R.id.VideoView2);
    // Add controls to the MediaPlayer (play, pause, ...)
    MediaController mc2 = new MediaController(this);
    vv2.setMediaController(mc2);
    String path2 = Environment.getExternalStorageDirectory() + "/DCIM/mjpeg2.avi";
    vv2.setVideoPath(path2);
    // Set the focus
    vv2.requestFocus();
}

daiane_angolini
NXP Employee

In order to make sure your release has the JPEG decoder, please take a look at the MM release notes.  Sometimes it depends on the release version.

Another important thing to mention is that MJPEG is a sequence of several JPEG images.  So maybe the video decoder is there, accelerated, but the picture decoder is not; from the source code point of view, though, it's just a matter of "how many pictures" to decode in the same "context".


phopkins
Contributor III

Thank you for this tip about Freescale's MM.

An MJPEG file is literally a series of JPEGs, each with a standard JPEG header.  In the case of the AVIs I was playing, they are just raw JPEGs strung together with an AVI wrapper around them.  When the AVI makes its way through the Freescale multimedia stack, it gets parsed, and the stack decides it is JPEG-encoded video.  You'll find a switch statement that treats MJPEG and JPEG identically (conceptually like the snippet below), so for most purposes MJPEG and JPEG are synonyms within Freescale's multimedia stack.  It creates a handle to the VPU that does the JPEG decoding, and away it goes.  I also believe the media stack uses the resizer to convert the YUV colorspace output (of the JPEG decoder) to an RGB format that can be rendered onto a surface.
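Conceptually, the mapping inside the stack is no more than this (an illustrative paraphrase, not the actual Freescale source):

    /* Both roles select the same VPU codec standard */
    switch (videoFormat) {              /* hypothetical enum/variable names */
    case FORMAT_JPEG:
    case FORMAT_MJPEG:
        openParam.bitstreamFormat = STD_MJPG;
        break;
    }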

In my case, we are streaming JPEGs from another piece of hardware I'm programming.  I'm using gstreamer on that device to produce JPEGs wrapped with gstreamer's "multipartmux", with a "tcpserversink" to shoot them over the network.  However, I cannot figure out how to wrap my JPEGs so that the Freescale media stack accepts my stream as JPEG/MJPEG and uses the hardware decoder.  I've tried several other gstreamer mux elements to identify my MJPEG stream, unsuccessfully.

I ended up writing my own demux in software, specifically for "multipartmux" (a simplified version is sketched below).  Now I've got the JPEGs in a buffer on the Android system and will process them as they come in.  I plan on feeding them to a specific part of Freescale's multimedia stack.  I will describe this process in the next post, as I have taken a stab at unraveling Freescale's multimedia stack in Android and have found some useful information.
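The demux itself is just byte scanning.  A simplified version of what I wrote (the real multipartmux output also carries a boundary line and Content-Type/Content-Length headers per part, but slicing on the JPEG SOI/EOI markers is enough to illustrate it):

    #include <stddef.h>

    /* Scan a receive buffer for one complete JPEG: SOI (FF D8)..EOI (FF D9).
       Returns the frame length and sets *start, or 0 if no full frame yet. */
    size_t extract_jpeg(const unsigned char *buf, size_t len,
                        const unsigned char **start)
    {
        size_t soi = 0, i;
        int have_soi = 0;

        for (i = 0; i + 1 < len; i++) {
            if (!have_soi && buf[i] == 0xFF && buf[i + 1] == 0xD8) {
                soi = i;
                have_soi = 1;
            } else if (have_soi && buf[i] == 0xFF && buf[i + 1] == 0xD9) {
                *start = buf + soi;
                return i + 2 - soi;     /* length including the EOI marker */
            }
        }
        return 0;                       /* incomplete; wait for more data */
    }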

phopkins
Contributor III

*EDIT*

I believe Freescale does not provide an OpenMAX lib for JPEG encoding or decoding.  Looking at other phone/tablet configurations in the AOSP, almost no phone exposes its hardware JPEG decoder through OpenMAX.  So, in order to get hardware JPEG decoding in Android, I will need to do one of the following:

--Write a custom driver that can talk to JNI

--Write a custom native daemon that exposes a network service for decoding

--Create an OpenMAX plugin from Freescale's code

--Hope someone at Freescale is about done with one of the above.

Making the JPEG decoder will involve porting the existing code that works in the gstreamer plugins Freescale ships for Linux.  Hopefully that code will compile easily with the "ndk-build" command and its makefile system; a sketch of the build glue is below.
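If ndk-build cooperates, the glue module would be declared along these lines; the module and library names here are assumptions, so match them to what is actually in the BSP or on the device:

    # Android.mk sketch: JNI glue linking against Freescale's VPU libs.
    LOCAL_PATH := $(call my-dir)

    include $(CLEAR_VARS)
    LOCAL_MODULE    := mjpeg_hw_decode
    LOCAL_SRC_FILES := mjpeg_hw_decode.c
    # Prebuilt libs pulled from the device or the BSP build output:
    LOCAL_LDLIBS    := -L$(LOCAL_PATH)/prebuilt -lvpu -l_vpu_wrapper -llog
    include $(BUILD_SHARED_LIBRARY)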

Can I get verification from anyone on this?  Perhaps a Freescale engineer?

Please see the response below.  You may or may not be able to access the JPEG encoder with standard OpenMAX callouts (no, I really don't know whether you can, as I'm not an expert), but there are many Freescale-specific libraries you can access to encode/decode JPEGs.