Using Grafana to graph a simple timeline from prometheus:
Why is there such a difference between the `rate` and `avg` functions? `rate` gives points lower than the minimum value.
Note: in the screenshot you are actually using `irate`, not `rate`.
Either way, both functions are meant for monotonically increasing counters and calculate how fast the counter increases. Memory usage is not a counter, it is a gauge, so applying `(i)rate` to it is not meaningful. See the documentation on metric types.
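For a gauge like memory usage you can graph the series directly, or smooth it with a gauge-appropriate function such as `avg_over_time`. A sketch, assuming a hypothetical metric name `node_memory_usage_bytes` (substitute whatever your exporter actually exposes):

```promql
# Graph the gauge as-is:
node_memory_usage_bytes

# Or smooth it over 5-minute windows:
avg_over_time(node_memory_usage_bytes[5m])

# rate()/irate() belong on counters, e.g. a request counter:
rate(http_requests_total[5m])
```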
Your graph shows how much the memory usage increases per second, not how much memory is in use, which is probably what you want. Notice that you get a spike in `irate` each time the memory usage drops instead of increasing. The drop is interpreted by `(i)rate` as a counter reset, so after the drop/reset Prometheus thinks the counter shot from 0 back up to the current memory usage in just a second.
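To see why each drop produces a spike, here is a minimal Python sketch of the reset-detection idea behind `(i)rate`: any decrease between consecutive samples is treated as the counter having restarted from zero, so the whole new value is counted as an increase. The sample values are made up for illustration, and this only mimics the core idea, not Prometheus's actual extrapolation logic.

```python
def counter_increase(samples):
    """Total increase over the samples, treating any drop as a counter reset."""
    total = 0.0
    for prev, cur in zip(samples, samples[1:]):
        if cur >= prev:
            total += cur - prev
        else:
            # Drop detected: assumed reset, so the counter is taken to have
            # restarted at 0 and the entire current value counts as increase.
            total += cur
    return total

# Gauge-like memory samples (MB): usage falls from 900 to 300.
samples = [800, 850, 900, 300, 320]
window_seconds = 4 * 15  # 4 intervals at a 15s scrape interval

# The drop to 300 is misread as 0 -> 300, inflating the per-second rate.
print(counter_increase(samples) / window_seconds)  # prints 7.0
```

The misread drop contributes 300 to the "increase", which is exactly the spike you see on the graph whenever memory is freed.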
See the docs for `irate` and the docs for `rate`.