I'm putting Tsung logs into Elasticsearch (ES) so that I can filter, visualize, and compare results with Kibana.
I'm using Logstash and its JSON parsing filter to push the Tsung logs, in JSON format, into ES.
Tsung logs are a bit complicated (IMO): arrays nested inside arrays, multi-line events, and several fields sharing the same name, such as "value" in my example below.
I would like to transform this event:
{
  "stats": [
    {"timestamp": 1317413861, "samples": [
      {"name": "users", "value": 0, "max": 1},
      {"name": "users_count", "value": 1, "total": 1},
      {"name": "finish_users_count", "value": 1, "total": 1}]}]}
into this:
{"timestamp": 1317413861},{"users_value":0},{"users_max":1},{"users_count_value":1},{"users_count_total":1},{"finish_users_count_value":1},{"finish_users_count_total":1}
Since the entire Tsung log file is forwarded to Logstash at the end of a performance test campaign, I'm thinking of using regexes to strip the line breaks and the not-very-useful "stats" and "samples" wrappers before sending the events to Logstash, to simplify things a bit.
Then I would use JSON filter options along these lines (rough sketch of the full filter below):
add_field => {"%{name}_value" => "%{value}"}
add_field => {"%{name}_max" => "%{max}"}
add_field => {"%{name}_total" => "%{total}"}
But how should I handle the fact that a single event contains several "value" fields, for instance? What is the best approach?
Thanks for your help.
Feels like the ruby{} filter is what you need here: loop over the entries in the 'samples' field and construct your own fields based on the name/value/total/max.
There are examples of this type of behavior elsewhere on SO.
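A rough, untested sketch of what that could look like, assuming the json filter has already parsed each line into a top-level stats field, and assuming Logstash 5+ with the event.get/event.set API (on older versions you'd use event['field'] instead):

filter {
  json {
    source => "message"
  }
  ruby {
    code => "
      # Flatten each stats entry: copy the timestamp and turn every sample
      # into <name>_value / <name>_max / <name>_total fields on the event.
      stats = event.get('stats') || []
      stats.each do |stat|
        event.set('timestamp', stat['timestamp']) unless stat['timestamp'].nil?
        (stat['samples'] || []).each do |sample|
          name = sample['name']
          ['value', 'max', 'total'].each do |key|
            v = sample[key]
            event.set(name.to_s + '_' + key, v) unless v.nil?
          end
        end
      end
      # Drop the original nested structure once it's been flattened.
      event.remove('stats')
    "
  }
}

If you want the flattened timestamp to become the event's @timestamp, a date filter with the UNIX pattern can convert it afterwards.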