timeout property of plugin multisocketsink of Gstreamer does not work


Contributor II

Hi there.

I am facing issues with Gstreamer.

When I set the "timeout" property of the "multisocketsink" plugin, CPU usage goes to 100%.

I am using an NXP SABRE board with i.MX6Q, with a kernel and rootfs built with yocto-L4.14.98_2.0.0_ga (Sumo).

I also tried the original SD card image, and the same thing happens.

I debugged GStreamer and found the following:

1. gst-plugins-base/gst/tcp/gstmultisocketsink.c:gst_multi_socket_sink_thread() attaches a timeout source and its callback to the main_context.

2. However, the subsequent calls to g_main_context_iteration() do not block. On Ubuntu, the same call blocks here.

3. As a result, the while loop in gst_multi_socket_sink_thread() becomes a busy loop.

Has anyone run into the same situation?

Best regards,


Contributor II

In the end, this issue was fixed by updating GLib from 2.54.3 to 2.56.4.

Is this a bug in GLib?

Thank you for your help.

Best regards,


Contributor II

Hi Igor. Thank you for your immediate reply.

However, I think the phenomenon I encountered is some kind of malfunction of GLib on i.MX6 (ARM) with Yocto, because the GSource timeout does not seem to work correctly.

Curiously, the same phenomenon occurs in different environments (NXP, ARM), but it did not happen on Ubuntu (x86). It also happens with the original NXP SD card Linux image. I use the same version of GStreamer on each system.

It is easy to reproduce with the following commands:

# gst-launch-1.0 videotestsrc ! multisocketsink
OK: CPU usage is low according to the top command.

# gst-launch-1.0 videotestsrc ! multisocketsink timeout=1000000000
NG: CPU usage is 100% according to the top command.

It would be great if you could check whether the same thing happens on your side, or suggest further debugging steps if you think I have missed something.

Best regards,


Contributor II

I tried the following code snippet.


#include <stdio.h>
#include <unistd.h>
#include <glib.h>

static gboolean
g_source_func(gpointer user_data) {
    printf("%s\n", __func__);
    return FALSE;   /* one-shot: remove the source after it fires */
}

static gpointer
timeout_test_thread(gpointer data) {
    int cnt = 0;
    GMainLoop *loop = data;
    GMainContext *context = g_main_loop_get_context(loop);

    while (1) {
        printf("%s:%d:%d\n", __func__, __LINE__, cnt);
        GSource *timeout = g_timeout_source_new(2000);  /* 2 s */
        g_source_set_callback(timeout, g_source_func, NULL, NULL);
        g_source_attach(timeout, context);
        g_source_unref(timeout);

        /* should block here until the timeout is ready */
        g_main_context_iteration(context, TRUE);

        cnt += 1;
    }
    return NULL;
}

int main(void) {
    GMainLoop *loop = g_main_loop_new(NULL, FALSE);
    GThread *thread = g_thread_new("timeout_test_thread", timeout_test_thread, loop);

    g_thread_join(thread);   /* keep the process alive; the test thread never exits */
    g_main_loop_unref(loop);
    return 0;
}

When I build and run it, "g_source_func" is printed 2 seconds after each iteration starts, and the call blocks as expected. That is OK.

So the GLib timeout works in this minimal test. In the NXP environment, I guess the context is being woken up unnecessarily, so the iteration returns immediately instead of blocking.


NXP TechSupport

Hi Taro,

You can try to debug it using:

GStreamer Debugging | GStreamer Element Debugging | Gst debug 

Best regards
Note: If this post answers your question, please click the Correct Answer button. Thank you!
