Hi, I am using an i.MX53 based board with an OV5640 camera. The rootfs is built with Yocto recipes and includes the Freescale MM plugins.
What I want to achieve is continuous recording (without ever stopping) with the stream cut into sequential files by a predefined duration or file size.
I want to record with GStreamer and encode the video as H.264. I know GStreamer has a multifilesink element, but in my tests it does not work with Freescale's encoder; perhaps some metadata is missing.
Has anyone tried something similar and had luck with it? Right now I restart the command every 5 seconds (to record 5-second videos), and the result is gaps between the videos caused by the start/stop delay.
Can this be achieved with a command in the terminal, or will it involve GStreamer programming in C or Python?
Thank you
Hi Janis,
we are still investigating the root cause of this issue. We have verified that the multifilesink element works fine under Ubuntu (both scenarios: vpuenc->multifilesink and vpuenc->muxer->multifilesink). So, as another work-around, you can use that system.
Leo
Sorry, I wasn't precise when I said "multifilesink doesn't work". It does work, but it doesn't create usable media files when placed after a muxer, most likely because it doesn't account for the container format: it simply cuts the buffers at the required size and dumps them into files. That's why only the first file is usable (it contains the needed metadata), while all the files created after it are unusable.
For now I am trying to create a GStreamer application based on this guy's solution:
Gstreamer: Stream H264 webcam data to series of files | groak@{subjects of research}
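The core idea in that post (split the raw H.264 stream only at key frames, so every file starts with decodable data) can also be expressed with multifilesink alone, provided there is no muxer in the pipeline. A sketch, untested on the i.MX53, assuming your multifilesink build supports the key-frame mode of the next-file property:

```shell
# Cut the raw H.264 elementary stream at key frames; each chunk-NNNNN.h264
# then begins with an IDR frame and should be playable on its own.
gst-launch mfw_v4lsrc ! vpuenc codec=avc ! \
    multifilesink location=chunk-%05d.h264 next-file=key-frame
```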
Makes sense. On the i.MX6Q we have found that multifilesink built with LTIB does not work as expected. On the other hand, have you succeeded with your approach? Have you tried ffplay?
Leo
Hi Leonardo,
I have tracked down the (or one) issue with 'multifilesink' on the i.MX6Q with LTIB.
Our use case is: 'vpuenc' generates a JPEG stream, which is saved as individual picture files with 'multifilesink'. We use it for our i.MX surveillance camera application (Argos - BlueWiki).
The application crashed with a segfault.
Attached is a patch for gst-plugins-good which should fix this problem.
Harald
Hi Harald,
Thanks for sharing the patch. I cannot identify the change which fixes the problem; can you post just the part which does the fix?
Leo
Sorry,
I have updated the patch.
LTIB has put a bit too much into the patch file.
-Harald
Hello, Janis!
I successfully used ffplay to view the resulting files.
Hi Janis,
Does BrilliantovKirillVladimirovich's answer work for you? You can also try matroskamux.
Regards,
Song Bing.
Hello, Janis!
You are not right: multifilesink works, and it does not depend on the encoder.
Here is my code for writing encoded video to a file:
GstElement *multifilesink = gst_element_factory_make("multifilesink", "multifilesink");
g_object_set(G_OBJECT(multifilesink),
             "location", file_write_name(stream, dir, prefix),
             "index", start_index,
             "next-file", GST_MULTI_FILE_SINK_NEXT_MAX_SIZE, /* rotate when max-file-size is reached */
             "max-files", MAX_WRITE_FILE,
             "max-file-size", file_write_size(size), /* property is a guint64; the helper must return one */
             NULL);
I use the following pipeline:
capture -> encode -> avimux -> multifilesink.
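For reference, that element setup corresponds roughly to the gst-launch line below. A sketch, untested here; the location pattern, index, and size limits are illustrative placeholders for the helpers in the code above:

```shell
# Same pipeline spelled out for gst-launch: rotate the AVI output every
# ~10 MB and keep at most 10 files (both numbers are examples).
gst-launch mfw_v4lsrc ! vpuenc codec=avc ! avimux ! \
    multifilesink location=rec-%05d.avi index=0 \
        next-file=max-size max-file-size=10485760 max-files=10
```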
Hi Kirill,
if I understood correctly, you found out that when the element is constructed manually (in code), multifilesink does work?
Leo
Hello, Leo!
Yes, I use it in my program and it works correctly.
Hi Janis, I queried your post internally; I will update this thread ASAP.
Regarding your task, you may be able to do it with just gst-launch. I have not experimented much with that sink element, but I can see it has a 'next-file' property that may be useful. On the other hand, you can write a script with an endless loop that runs a pipeline with X num-buffers on each iteration. This approach is a bit tricky, but it avoids writing a C/Vala/Python program.
Leo
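A continuous pipeline using that 'next-file' property might look like the sketch below (untested; if the chunks after the first turn out unplayable, that is exactly the muxer-header issue described elsewhere in this thread):

```shell
# One never-stopping pipeline; multifilesink starts a new file whenever
# the current one reaches max-file-size (~5 MB here), so there is no
# start/stop gap between chunks.
gst-launch mfw_v4lsrc ! vpuenc codec=avc ! avimux ! \
    multifilesink location=video-%03d.avi next-file=max-size max-file-size=5242880
```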
Thanks for your answer. Can you please provide more information about that script with the pipeline? I really don't know how that would work.
Hi Janis,
As you found out, multifilesink is not working when fed buffers from the Freescale encoder, so one way is to use filesink. Check this script (I have not tested it :) )
#!/bin/sh
minutes=1   # duration of each video chunk, in minutes
fps=15
numbuffers=`expr $minutes \* 60 \* $fps`
while true
do
    outfile=`date | sed 's/ //g'`.h264
    echo "Recording into $outfile"
    gst-launch mfw_v4lsrc num-buffers=$numbuffers ! vpuenc codec=avc ! filesink location=$outfile
done