I'm going to keep people up to date here, posting my thoughts and findings, as these questions have never really been answered well by Freescale. I do not have an answer yet, but I'm getting closer. The following links are similar questions that have not been answered:
How to use MJPEG hardware decoder in Android with IMX6Q
MJPEG format in UvcDevice.cpp(i.MX6 Android camera HAL)
What I have found is a good amount of code and libraries that Freescale has added to their Android source. If you download their Android Open Source Project package, you will find an "external" directory at the root of the project. This seems to be the dumping ground for proprietary and device-specific code, libraries, and executables. I've been focusing on the fsl_imx_omx directory, as the majority of Freescale's video code is in there.
In one of the threads mentioned above, Daiane Angolini shared some useful information about hardware acceleration working in Gallery. I was able to get a JPEG-encoded AVI to play in Gallery, hardware accelerated. Then I made an application using VideoView (which renders into a SurfaceView) to play two 480p MJPEG AVIs at 60 FPS at once. The process called "mediaserver" handled this task: it used about 10% CPU while my application used 0%. This shows that Freescale has code somewhere that can hardware-decode JPEGs.
Upon further investigation, I found that "OMXPlayer" was being invoked, possibly by the mediaserver process. However, there is no OMXPlayer process, so it must be a library that is being called; you can find it in 'external/fsl_imx_omx/Android/'. If you continue to study the 'external/fsl_imx_omx' folder and all the source code, you will see that Freescale has built various libraries that are installed and exposed to the NDK. They have done a good job of layering these libraries, so you can take control of the media pipeline at any point in the processing chain. This is where I'm at: I need to figure out how to create a JPEG decoder, initialize it correctly, feed it frames, and convert its output into something that can be displayed on the screen. I've been finding it useful to look at each Android.mk file, since it lists every library the target module references; from there you can research each of those libraries.
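To speed up that Android.mk survey, you can list every module's linked libraries in one pass. This is just a sketch run from the AOSP root; the 'external/fsl_imx_omx' path is the one described above, and the exact tree layout may differ between BSP releases:

```shell
# List the shared libraries each Freescale OMX module links against,
# along with the Android.mk file and line where they are declared.
grep -rn --include=Android.mk "LOCAL_SHARED_LIBRARIES" external/fsl_imx_omx/
```

Each hit names the libraries (e.g. libutils, liblog, vendor libs) that the module depends on, which tells you which sources to read next.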
Here is the code that played the two JPEG-encoded AVIs with hardware decoding:
public void ConnectVideoOnly()
{
    setContentView(R.layout.rtsp_view);

    // First VideoView; the MediaController adds play/pause controls.
    VideoView videoView = (VideoView) this.findViewById(R.id.videoView1);
    MediaController mc = new MediaController(this);
    videoView.setMediaController(mc);
    String path = Environment.getExternalStorageDirectory() + "/DCIM/mjpeg.avi";
    videoView.setVideoPath(path);
    videoView.requestFocus();

    // Second VideoView, playing another MJPEG AVI at the same time.
    VideoView vv2 = (VideoView) this.findViewById(R.id.VideoView2);
    MediaController mc2 = new MediaController(this);
    vv2.setMediaController(mc2);
    String path2 = Environment.getExternalStorageDirectory() + "/DCIM/mjpeg2.avi";
    vv2.setVideoPath(path2);
    vv2.requestFocus();
}
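The code above assumes a layout resource named rtsp_view containing two VideoViews with the ids used in the method. The actual layout wasn't posted, so this is only a minimal sketch of what it might look like:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Hypothetical res/layout/rtsp_view.xml; only the ids match the code above. -->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical" >

    <VideoView
        android:id="@+id/videoView1"
        android:layout_width="match_parent"
        android:layout_height="0dp"
        android:layout_weight="1" />

    <VideoView
        android:id="@+id/VideoView2"
        android:layout_width="match_parent"
        android:layout_height="0dp"
        android:layout_weight="1" />
</LinearLayout>
```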