I am trying to build a live video streaming embedded server on a Freescale i.MX6 with GStreamer and the H.264 codec. The client (a Linux PC) is connected over a wireless link and receives the video stream transmitted by the server.
Recently I performed some tests to measure the average bitrate of an H.264-encoded file on the Freescale i.MX6. I used the GStreamer element videotestsrc, ran it for 60 seconds, encoded the output with the gstreamer-imx plugin (which uses the hardware H.264 encoder), saved the result to a file, and calculated the bitrate. Here is the command:
# Capture 60 seconds of test video, encode it with the i.MX6 VPU and
# print the average bitrate; timeout sends SIGINT so that -e forces a
# clean EOS and the file is finalized properly
timeout --signal=SIGINT 60s \
gst-launch-1.0 -ev videotestsrc pattern=18 ! \
video/x-raw,width=100,height=50,framerate=10/1 ! \
imxvpuenc_h264 bitrate=5 ! h264parse ! \
filesink location=60s.h264 && \
echo $(( $(stat --format=%s 60s.h264) / 60 * 8))   # average bitrate in bit/s
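I repeated this at each resolution with a small wrapper loop (a hypothetical sketch of my procedure; the actual runs were invoked by hand, and the output file names here are illustrative):

# Run the same 60-second capture for each resolution under test
for res in 100x50 320x240 640x480 1280x960; do
    w=${res%x*}; h=${res#*x}    # split "WxH" into width and height
    timeout --signal=SIGINT 60s \
    gst-launch-1.0 -ev videotestsrc pattern=18 ! \
    video/x-raw,width=$w,height=$h,framerate=10/1 ! \
    imxvpuenc_h264 bitrate=5 ! h264parse ! \
    filesink location=60s_${res}.h264
done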
The test was performed at 10 fps for several resolutions; the measured average bitrates (in bit/s) are as follows:
Resolution    Average bitrate
100 x 50      ~120.3K
320 x 240     ~93.7K
640 x 480     ~52.7K
1280 x 960    ~32.3K
The bitrate I calculated is the effective file transfer rate, which is what I need to budget for when sending the stream over the wireless communication link.
I am trying to understand the behaviour of the H.264 encoder: why is the bitrate inversely proportional to the resolution? I had expected the bitrate to increase with resolution, which is why I was opting for a lower-resolution image for streaming over the wireless link. Can anyone explain why the H.264 codec behaves this way, or is my understanding of the bitrate demand at fault?
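To double-check how the element defines its bitrate property, its documentation can be queried on the target (assuming gst-inspect-1.0 is installed there):

# Print the encoder's property list; the bitrate entry shows its
# unit and default value
gst-inspect-1.0 imxvpuenc_h264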
Here are the file sizes of the encoded video files:
Resolution    File size (bytes)
100 x 50      3886780
320 x 240     704543
640 x 480     397098
1280 x 960    235108
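The average bitrate follows directly from these sizes: bits per second = bytes * 8 / 60. A minimal check over the saved files (the file names are hypothetical, matching the loop above):

# bitrate in bit/s = file size in bytes * 8 / duration in seconds
for f in 60s_100x50.h264 60s_320x240.h264 60s_640x480.h264 60s_1280x960.h264; do
    echo "$f: $(( $(stat --format=%s "$f") * 8 / 60 )) bit/s"
done

For example, 704543 bytes gives 704543 * 8 / 60 ≈ 93939 bit/s, i.e. roughly the ~93.7K measured for 320 x 240.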
Could anyone with H.264 encoder expertise explain this behaviour?