I have the following pipeline. The requirement is that "metrics" data must be written to one file and event data to another file. I am having two issues with this pipeline.

The file output is not creating a timestamped file every 30 seconds; instead it creates a single file literally named output%{[@metadata][ts]}.csv and keeps appending data to it.

The csv output does create a new timestamped file every 30 seconds, but it also creates one extra file named output%{[@metadata][ts]} and keeps appending the metric data to that file.

Can someone please guide me on how to fix this?
input {
  beats {
    port => 5045
  }
}
filter {
  ruby {
    code => '
      event.set("[@metadata][ts]", Time.now.to_i / 30)
      event.set("[@metadata][suffix]", "output" + (Time.now.to_i / 30).to_s + ".csv")
    '
  }
}
filter {
  metrics {
    meter => [ "code" ]
    add_tag => "metric"
    clear_interval => 30
    flush_interval => 30
  }
}
output {
  if "metric" in [tags] {
    file {
      flush_interval => 30
      codec => line { format => "%{[code][count]} %{[code][count]}" }
      path => "C:/lgstshop/local/csv/output%{[@metadata][ts]}.csv"
    }
    stdout {
      codec => line {
        format => "rate: %{[code][count]}"
      }
    }
  }
  file {
    path => "output.log"
  }
  csv {
    fields => [ "created", "level", "code" ]
    path => "C:/lgstshop/local/output%{[@metadata][ts]}.evt"
  }
}
A metrics filter generates new events in the pipeline. Those events will only go through filters that come after it. Thus the metric events do not have a [@metadata][ts] field, so the sprintf references in the output section are not substituted. Move the ruby filter so that it is after the metrics filter.
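For example, reordering the filter sections so the ruby filter runs after the metrics filter could look like this (a sketch that simply reuses the filters from the config above):

filter {
  metrics {
    meter => [ "code" ]
    add_tag => "metric"
    clear_interval => 30
    flush_interval => 30
  }
}
# Running the ruby filter after the metrics filter means the generated
# metric events also get [@metadata][ts] and [@metadata][suffix] set.
filter {
  ruby {
    code => '
      event.set("[@metadata][ts]", Time.now.to_i / 30)
      event.set("[@metadata][suffix]", "output" + (Time.now.to_i / 30).to_s + ".csv")
    '
  }
}

With this ordering the sprintf reference in the file output path is substituted for the metric events, so a new timestamped .csv file is created every 30 seconds instead of one literally named output%{[@metadata][ts]}.csv.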
If you do not want the metric events sent to the csv output, wrap that output in an if "metric" not in [tags] { ... } conditional, or put it in an else of the existing conditional.
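For example, using an else branch could look like this (a sketch that reuses the outputs from the config above; where the plain file output for output.log goes depends on whether you still want every event written to that log):

output {
  if "metric" in [tags] {
    file {
      flush_interval => 30
      codec => line { format => "%{[code][count]} %{[code][count]}" }
      path => "C:/lgstshop/local/csv/output%{[@metadata][ts]}.csv"
    }
    stdout {
      codec => line { format => "rate: %{[code][count]}" }
    }
  } else {
    # Only the non-metric (beats) events reach this branch, so the metric
    # events are no longer appended to the .evt file.
    csv {
      fields => [ "created", "level", "code" ]
      path => "C:/lgstshop/local/output%{[@metadata][ts]}.evt"
    }
  }
  # Left outside the conditional so every event is still written here.
  file {
    path => "output.log"
  }
}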