Hi @joanxie ,
Thanks for the queue suggestion. It definitely changes the behaviour of the transcoding process, but I'm afraid it's still not fast enough.
For one thing, although it is faster than before, it still takes too long (17 seconds versus 28). I also noticed that the output video is twice the length of the original and is reported as 30 FPS instead of 60 FPS, even though it appears to play back at the correct speed.
Here's some more detailed feedback from the customer who is interested in the real-time transcoding feature:
Regarding the framerate of the video file, I can confirm that it is intended to be 30 FPS and approximately 8 seconds long. However, I have observed it playing at 60 FPS for some reason, which is not correct. I previously linked another MJPEG file that could be used as an alternative test, as it was encoded by a third party and appears to have no frame-rate issues, but it is also still slow to transcode. For ease, here is the link again, along with the other issues I had observed at the time:
As you discovered, the video is not processed fast enough to be used in real-time. I have confirmed this with another recorded MJPEG file (https://filesamples.com/samples/video/mjpeg/sample_1920x1080.mjpeg) and with an MJPEG IP camera. I can also confirm that the CPU is not overloaded at all when doing this. The pipeline I have used to transcode the above file is as follows:
gst-launch-1.0 -e filesrc location=sample_1920x1080.mjpeg ! jpegparse ! v4l2jpegdec ! imxvideoconvert_g2d ! video/x-raw,format=YUY2 ! imxvideoconvert_g2d ! v4l2h264enc ! h264parse ! mp4mux ! filesink location=test.mp4
When streaming from an MJPEG IP camera, it appears necessary to manually specify the video framerate in a caps filter. Otherwise, the JPEG decoder consistently reports a framerate of 0 fps, which the H.264 encoder does not support. However, this can be worked around easily.
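For reference, the workaround I used looks roughly like this. This is only a sketch: the camera URL and the 30/1 framerate are placeholders for my setup, and I am assuming an HTTP multipart MJPEG stream (where `multipartdemux` leaves the framerate unfixed, so a caps filter can pin it):

```shell
# Sketch only: forcing a nominal framerate on an MJPEG camera stream.
# http://CAMERA-IP/mjpeg and framerate=30/1 are placeholders, not real values.
gst-launch-1.0 -e souphttpsrc location=http://CAMERA-IP/mjpeg \
    ! multipartdemux ! "image/jpeg,framerate=30/1" ! jpegparse \
    ! v4l2jpegdec ! imxvideoconvert_g2d ! video/x-raw,format=YUY2 \
    ! imxvideoconvert_g2d ! v4l2h264enc ! h264parse ! mp4mux \
    ! filesink location=camera.mp4
```

Without the `image/jpeg,framerate=30/1` caps filter, the downstream caps carry 0/1 and the encoder refuses to negotiate.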
When directly linking the JPEG decoder to the H.264 encoder, the output video is corrupted because the H.264 encoder interprets the colourspace incorrectly. For me, this occurs with the following pipeline using the sample video linked above:
gst-launch-1.0 filesrc location=sample_1920x1080.mjpeg ! jpegparse ! v4l2jpegdec ! v4l2h264enc ! h264parse ! mp4mux ! filesink location=test.mp4
However, this can be worked around by converting to YUY2 format and back as in the original pipeline for this sample.
It does not seem possible to convert the output of the MJPEG decoder to any RGB format (RGBA, RGBx, xRGB, BGRx, etc.) and then re-encode it as H.264. The error given in this case is:
g2d_opencl_conversion opencl conversion does not support input format 1 (this number varies from 1-8 depending on the RGB format selected)
The library causing this error appears to be imx-dpu-g2d, but I cannot investigate further as it is closed-source.
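To help reproduce this on our side, I believe the failing RGB variant corresponds to a pipeline of this shape. This is my reconstruction from the description above, not a pipeline the customer posted verbatim; RGBA stands in for whichever RGB format was tried:

```shell
# Reconstruction (not verbatim from the report): replacing YUY2 with an RGB
# format such as RGBA should make the second imxvideoconvert_g2d fail with
# "g2d_opencl_conversion opencl conversion does not support input format N".
gst-launch-1.0 -e filesrc location=sample_1920x1080.mjpeg ! jpegparse \
    ! v4l2jpegdec ! imxvideoconvert_g2d ! video/x-raw,format=RGBA \
    ! imxvideoconvert_g2d ! v4l2h264enc ! h264parse ! mp4mux \
    ! filesink location=test.mp4
```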