I have an RRD that contains a series of ping roundtrip times collected at one-minute intervals:
DEF:roundtrip=$TARGET.rrd:rtt:LAST
When I try to extract the maximum value of this series with a VDEF, rrdgraph dumps core:
VDEF:rttmax=roundtrip,MAXIMUM
*** buffer overflow detected ***: terminated
I am trying to get a data series that just plots a horizontal line at the maximum of the series over the time span of the graph, which I can then use to scale other data series in the graph.
I try:
DEF:roundtrip=$TARGET.rrd:rtt:LAST
DEF:maxrtt=$TARGET.rrd:rtt:MAX
This yields maxrtt identical to roundtrip. (I think I understand why, though: the consolidation functions MAX and LAST behave the same in the context of this rrd, which only contains the series of 1-minute-interval RTT times.)
Any ideas what I am doing wrong with respect to the VDEF? Am I not going to be able to do this without putting the maximum in the rrd itself?
(Ultimately my aim is to make a graph with the ping roundtrip time on one Y-axis and the percentage packet loss on the other Y-axis. The roundtrip times vary hugely, between 0 and 1000 on some graphs and 0 and 10000 on others, so that data needs to autoscale. The packet loss is fixed between 0 and 100%.)
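To be concrete about the scaling I want: the packet-loss series should be stretched so that 100% loss lines up with the maximum roundtrip time, i.e. plscaled = rttmax * packetloss / 100. Outside rrdtool that arithmetic looks like this (the numbers are made-up samples, not values from my RRD):

```shell
# The scaling I want the CDEF to perform, reproduced with awk.
# rttmax and packetloss are hypothetical sample values.
rttmax=250        # max roundtrip time over the graph window (ms)
packetloss=12     # packet loss (%)
plscaled=$(awk -v m="$rttmax" -v p="$packetloss" \
    'BEGIN { printf "%.2f\n", m * p / 100 }')
echo "plscaled=$plscaled"    # plscaled=30.00
```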
rrdtool info:
filename = "K6ORI-LPD-TABLETOP-OMNI.rrd"
rrd_version = "0003"
step = 60
last_update = 1690739945
header_size = 896
ds[pl].index = 0
ds[pl].type = "GAUGE"
ds[pl].minimal_heartbeat = 120
ds[pl].min = 0.0000000000e+00
ds[pl].max = 1.0000000000e+02
ds[pl].last_ds = "0"
ds[pl].value = 0.0000000000e+00
ds[pl].unknown_sec = 0
ds[rtt].index = 1
ds[rtt].type = "GAUGE"
ds[rtt].minimal_heartbeat = 120
ds[rtt].min = 0.0000000000e+00
ds[rtt].max = 1.0000000000e+07
ds[rtt].last_ds = "9.099"
ds[rtt].value = 5.1636724911e+01
ds[rtt].unknown_sec = 0
rra[0].cf = "MAX"
rra[0].rows = 1500
rra[0].cur_row = 440
rra[0].pdp_per_row = 1
rra[0].xff = 5.0000000000e-01
rra[0].cdp_prep[0].value = NaN
rra[0].cdp_prep[0].unknown_datapoints = 0
rra[0].cdp_prep[1].value = NaN
rra[0].cdp_prep[1].unknown_datapoints = 0
rrdgraph command that dumps core:
rrdtool graph $TARGET.png -w 700 -h 400 -a PNG \
--start -7200 --end now --slope-mode \
--title "$TARGET" --watermark "`date`" --vertical-label "latency(ms)" \
--lower-limit 0 --x-grid MINUTE:10:HOUR:1:MINUTE:30:0:%R \
DEF:roundtrip=$TARGET.rrd:rtt:LAST \
DEF:packetloss=$TARGET.rrd:pl:LAST \
VDEF:rttmax=roundtrip,MAXIMUM \
CDEF:plscaled=rttmax,packetloss,*,100,/ \
AREA:plscaled#ff8800:"packetloss (%)" \
LINE2:roundtrip#000000:"roundtrip (ms)\n" \
GPRINT:roundtrip:LAST:"Cur\: %5.2lf" \
GPRINT:roundtrip:MAX:"Max\: %5.2lf" \
GPRINT:roundtrip:MIN:"Min\: %5.2lf\n" \
GPRINT:plscaled:LAST:"Plscaled\: %5.2lf\n" \
GPRINT:packetloss:LAST:"Packetloss\: %5.2lf\n" \
GPRINT:rttmax:LAST:"Rttmax\: %5.2lf\n" \
Your RRD file does not contain the RRA types that your graph command is expecting. That seems to be tickling some sort of bug in your version of rrdtool (which version are you using?), hence the buffer overflow.
To address this, you should modify the RRA types in your RRD.
Currently, you have a single RRA, type MAX, granularity 1pdp (step 60s), approximately 25 hours long.
Your graph command is trying to get data of type LAST, with a granularity in the region of 18s (implicit) over a 2 hour window.
The RRA has the requested time window, and a 60s granularity is close enough, but the type (the Consolidation Function) is wrong. Although at this granularity LAST, MAX, MIN and AVERAGE are all functionally identical, the mismatch may be confusing rrdtool.
I suggest you either change the DEF lines in your graph command to request the MAX consolidation function (the CF your one RRA actually stores), or rebuild the RRD with an AVERAGE RRA and point the DEFs at that (note that recreating the file discards the existing data).
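A sketch of the first option, with $TARGET set as in your original command. I can't verify that this avoids the crash without knowing your rrdtool version, so treat it as a starting point rather than a confirmed fix:

```shell
# Same graph command, but with the DEF consolidation functions switched
# from LAST to MAX, matching the only RRA present in the file.
rrdtool graph $TARGET.png -w 700 -h 400 -a PNG \
    --start -7200 --end now --slope-mode \
    --lower-limit 0 \
    DEF:roundtrip=$TARGET.rrd:rtt:MAX \
    DEF:packetloss=$TARGET.rrd:pl:MAX \
    VDEF:rttmax=roundtrip,MAXIMUM \
    CDEF:plscaled=rttmax,packetloss,*,100,/ \
    AREA:plscaled#ff8800:"packetloss (%)" \
    LINE2:roundtrip#000000:"roundtrip (ms)\n"
    # ...GPRINT lines as in the original command
```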
However, if in the future you DO want larger graphs (weekly etc.) with the MIN and MAX showing correctly, then you should consider adding MIN- and MAX-type RRAs as well, plus extra DEF statements that read from them for the variables used in the MIN/MAX GPRINT lines. That way you show the MAX of the MAX RRA rather than the MAX of the AVERAGE RRA; the two are the same at the current 1pdp resolution, but the latter gets progressively less accurate if you define lower-granularity RRAs in the future.
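For reference, a hypothetical create command along those lines. The DS definitions mirror your `rrdtool info` output, but the extra RRAs and their row counts are purely illustrative, and recreating the file discards existing data:

```shell
# 1-minute AVERAGE RRA for detailed graphs, plus coarser 30-minute
# AVERAGE, MIN and MAX RRAs for weekly/monthly views.
rrdtool create $TARGET.rrd --step 60 \
    DS:pl:GAUGE:120:0:100 \
    DS:rtt:GAUGE:120:0:10000000 \
    RRA:AVERAGE:0.5:1:1500 \
    RRA:AVERAGE:0.5:30:3000 \
    RRA:MIN:0.5:30:3000 \
    RRA:MAX:0.5:30:3000
```

Graphs over longer ranges would then add a separate DEF reading the rtt data source with the MAX CF, and use that variable in the Max GPRINT line.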