To rule out the Freescale-supplied aiurdemux demuxer from the equation, I used mpegtsdemux, an open-source GStreamer plugin. I launched a pipeline that demuxes the low-latency TS stream and saves the video to a file:
gst-launch rtspsrc location=rtsp://<ip> ! rtpmp2tdepay ! mpegtsdemux ! 'video/x-h264' ! filesink location=/path/to/myfile.ts
Then I passed "myfile.ts" from the step above to mxc_vpu_test.out like so:
./mxc_vpu_test.out -D "-i /root/myfile.ts -f 2 -w 1920 -h 1080 -s 1 -t 1"
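As a quick sanity check on the dumped file (assuming mpegtsdemux emits a raw Annex-B H.264 elementary stream here, which is my understanding, not something I have confirmed from the plugin source), the first bytes should be an H.264 start code:

xxd -l 4 /root/myfile.ts

This should show 00 00 00 01 (or 00 00 01) if the demuxed output really is a clean elementary stream rather than junk.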
I did the above two steps for both a low-latency stream and a regular stream, and in both cases ./mxc_vpu_test.out successfully played the generated TS file. So the H.264 hardware can clearly decode these streams, yet when one uses the "vpudec" plugin from GStreamer, they don't play. The demuxing must also be fine; otherwise we would get junk files. It therefore seems to me that FSL has deliberately designed the vpudec plugin to not play these streams, even though the hardware is capable of it. That is why I am confused by Jack's comments that low-latency is unsupported.
I am attaching the two generated TS files used in my experiment.
Please advise.