I have a small GStreamer example program that links two pipelines with an appsink and an appsrc. I want to extract H264 frames from the appsink, do something with them, and then feed them back in through the appsrc. The whole thing seems close to working: the program does open a window and shows a frame. But then the video is frozen, even though I can see that the appsink keeps handing me samples and I push them to the appsrc with the "push-buffer" signal.
My pipelines (in code) look like this:

The first one goes from videotestsrc to appsink, converting to H264 on the way:

videotestsrc pattern=1 is-live=true ! video/x-raw,format=I420,width=640,height=480 ! x264enc byte-stream=true ! h264parse config-interval=1 ! video/x-h264,stream-format=byte-stream ! appsink

The second one reads from appsrc and tries to display the stream with autovideosink:

appsrc ! decodebin ! autovideosink
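
In code, constructing the two pipelines looks roughly like the sketch below (built with gst_parse_launch; the element names recorder_sink and player_src are placeholders, not necessarily the ones used in my actual program):

#include <gst/gst.h>

// Build both pipelines from the launch strings above.
// Error handling is omitted for brevity.
static void build_pipelines(GstElement **capture_pipeline, GstElement **play_pipeline) {
    // First pipeline: videotestsrc -> x264enc -> h264parse -> appsink
    *capture_pipeline = gst_parse_launch(
        "videotestsrc pattern=1 is-live=true ! "
        "video/x-raw,format=I420,width=640,height=480 ! "
        "x264enc byte-stream=true ! h264parse config-interval=1 ! "
        "video/x-h264,stream-format=byte-stream ! appsink name=recorder_sink",
        NULL);

    // Second pipeline: appsrc -> decodebin -> autovideosink
    *play_pipeline = gst_parse_launch(
        "appsrc name=player_src ! decodebin ! autovideosink",
        NULL);
}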
This is how I pass buffers from the appsink to the appsrc:
static GstFlowReturn handle_new_sample(GstAppSink *appsink, GstElement *player_appsrc) {
    GstSample *sample = gst_app_sink_pull_sample(appsink);
    GstFlowReturn ret = GST_FLOW_OK;
    if (sample) {
        // Note: this is called repeatedly, so I know I am pushing buffers to my appsrc.
        char *caps = gst_caps_to_string(gst_sample_get_caps(sample));
        GstBuffer *buffer = gst_sample_get_buffer(sample);
        g_signal_emit_by_name(player_appsrc, "push-buffer", buffer, &ret);
        gst_sample_unref(sample);
        g_free(caps);
    }
    return ret;
}
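
The callback is attached to the appsink via its "new-sample" signal, roughly like this (emit-signals must be enabled on the appsink for the signal to fire; the variable names are placeholders):

#include <gst/gst.h>
#include <gst/app/gstappsink.h>

// Hypothetical wiring: recorder_sink is the appsink from the first pipeline,
// player_appsrc the appsrc from the second one.
static void connect_sink_to_src(GstElement *recorder_sink, GstElement *player_appsrc) {
    // appsink only emits "new-sample" when emit-signals is enabled
    g_object_set(recorder_sink, "emit-signals", TRUE, NULL);
    // handle_new_sample() receives the appsrc as user data
    g_signal_connect(recorder_sink, "new-sample",
                     G_CALLBACK(handle_new_sample), player_appsrc);
}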
The whole code (it's 113 lines, mostly boilerplate) is available here.
What am I doing wrong? Should I somehow tell the autovideosink to start PLAYING?
Something weird with the clock timing? I could not spot it, but try
g_object_set(player_videosink, "sync", FALSE, NULL);
Most likely you start the second pipeline but don't deliver a sample to it right away. By the time samples do arrive, they are already late from the autovideosink's point of view, so it drops everything.
Or the gap between the first frame and the following ones is larger than expected, exceeding the sink's current clock time.
Either way, video sinks like to drop buffers that arrive late and issue a QoS event upstream.
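
For example, assuming the playback pipeline was built with gst_parse_launch and the autovideosink was given a name (the names here are placeholders), the workaround could look like this:

#include <gst/gst.h>

// Fetch the named sink from the playback pipeline and disable clock
// synchronisation, so buffers are rendered as soon as they arrive
// instead of being dropped for being late.
static void disable_sink_sync(GstElement *play_pipeline) {
    GstElement *player_videosink =
        gst_bin_get_by_name(GST_BIN(play_pipeline), "player_videosink");
    if (player_videosink) {
        g_object_set(player_videosink, "sync", FALSE, NULL);
        gst_object_unref(player_videosink);
    }
}

With sync disabled the sink no longer compares buffer timestamps against the pipeline clock, which at least tells you whether the freeze is a timestamping problem rather than a data-flow problem.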