I'm experimenting with GStreamer on an embedded system, and I'm wondering if there is a way to determine the maximum amount of memory GStreamer will use. If I have a simple source -> filter -> filter -> sink
pipeline, can I figure out how many buffers each stage will allocate and what their maximum size would be?
My understanding is that I can't limit the memory usage, but I would at least like to understand the worst-case scenario. Is this possible, or is it too dependent on run-time conditions and/or data content? I'm also new to GStreamer, so please let me know if there is something I could add to the pipeline to make it more deterministic.
Thanks!
With gstreamer-0.10 you can use gst-tracelib (http://cgit.freedesktop.org/~ensonic/gst-tracelib/) to get e.g. peak memory consumption and various data-flow-related statistics. Normally elements don't keep copies of buffers around. Exceptions are e.g. queue-like elements and codecs (which need to keep reference buffers). Many elements try to work in-place, that is, they don't allocate new buffers but rather modify the buffer they received and pass it on (e.g. volume).
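If you want the worst case to be easier to bound, you can also put explicit limits on any queue-like elements yourself via the queue element's max-size-* properties. Here is a minimal sketch of a source -> queue -> filter -> sink pipeline with an explicitly bounded queue; videotestsrc, videoscale and fakesink are only placeholders for your real source, filter and sink elements:

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *pipeline, *src, *queue, *filter, *sink;
  GMainLoop *loop;

  gst_init (&argc, &argv);

  /* placeholder elements -- swap in the real source/filters/sink */
  pipeline = gst_pipeline_new ("bounded-pipeline");
  src    = gst_element_factory_make ("videotestsrc", "src");
  queue  = gst_element_factory_make ("queue", "queue");
  filter = gst_element_factory_make ("videoscale", "filter");
  sink   = gst_element_factory_make ("fakesink", "sink");

  /* cap the queue: at most 4 buffers, no byte or time limit (0 = unlimited) */
  g_object_set (queue,
      "max-size-buffers", (guint) 4,
      "max-size-bytes",   (guint) 0,
      "max-size-time",    (guint64) 0,
      NULL);

  gst_bin_add_many (GST_BIN (pipeline), src, queue, filter, sink, NULL);
  gst_element_link_many (src, queue, filter, sink, NULL);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  loop = g_main_loop_new (NULL, FALSE);
  g_main_loop_run (loop);

  return 0;
}

With the queue capped like that, its contribution to the worst case is at most max-size-buffers buffers of whatever size the negotiated caps allow, and in-place filters downstream should not add much on top of that.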