I'm feeding Logstash with a Kafka input. The messages look like this:
{
  "_index" : "progress",
  "_id" : "Q27Y2IYBIZUq2eJ6WJQR",
  "_score" : 1.0,
  "_source" : {
    "itemId" : 3,
    "weight" : 358,
    "timeStartedInMillis" : 39131,
    "flow" : 3725,
    "@version" : "1",
    "@timestamp" : "2023-03-13T02:41:42.313784Z",
    "event" : {
      "original" : "{\"timeStartedInMillis\": 39131, \"procedureId\": 3, \"temperature\": 47, \"weight\": 358, \"flow\": 3725}"
    },
    "type" : "log",
    "temperature" : 47,
    "tags" : [
      "kafka_source"
    ]
  }
}
How can I filter the message so that the output contains only this:
{
  "_index" : "progress",
  "itemId" : 3,
  "weight" : 358,
  "timeStartedInMillis" : 39131,
  "flow" : 3725,
  "temperature" : 47
}
You can use the Logstash prune filter to send only specific fields to Elasticsearch. Note that "_index" is document metadata set by the elasticsearch output (via its index option), not a field inside the event, so it cannot be kept by a filter.
Logstash.conf:
input {...}
filter {
  prune {
    # whitelist_names entries are regular expressions matched against
    # field names; anchor them to avoid accidental partial matches
    whitelist_names => [ "^itemId$", "^weight$", "^timeStartedInMillis$", "^flow$", "^temperature$" ]
  }
}
output {...}
Note that the drop filter is not an alternative here: it discards entire events, it cannot strip individual fields. If you only need to remove a handful of known fields, the mutate filter's remove_field option is another solution.
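For example, a minimal mutate-based sketch (field names taken from the event shown in the question) that removes the unwanted fields instead of whitelisting the rest:

```
filter {
  mutate {
    # drop Logstash bookkeeping fields, keeping the data fields
    remove_field => [ "@version", "@timestamp", "event", "type", "tags" ]
  }
}
```

Unlike prune, this silently keeps any new field that later appears in the messages, so prune with a whitelist is the safer choice when the output schema must stay fixed.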