I am using rsyslog to send plain-text logs to Logstash, but I cannot assign data to the host.name or host.ip fields with grok. The system throws the following error:
Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-syslog-2020.09.03", :routing=>nil, :_type=>"_doc"}, #LogStash::Event:0x3273b8c], :response=>{"index"=>{"_index"=>"logstash-syslog-2020.09.03", "_type"=>"_doc", "_id"=>"i2hRU3QBeWqyaoMf1lgh", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"Could not dynamically add mapping for field [host.ip]. Existing mapping for [host] must be of type object but found [text]."}}}}
I tried with [host][name] but got this error message:
Grok regexp threw exception {:exception=>"Could not set field 'name' on object '' to value ''. This is probably due to trying to set a field like [foo][bar] = someValue when [foo] is not either a map or a string" ...
Here is my grok configuration:
grok {
  match => { "message" => "<%{INT:log.syslog.priority}>%{SYSLOGTIMESTAMP:timestamp} %{SYSLOGHOST:host.name} %{DATA:process.name}(?:\[%{POSINT:process.pid}\])?: %{GREEDYDATA:syslog_message}" }
}
My purpose is to parse log messages according to the ECS standard so that the SIEM app can analyze them.
I think that rsyslog already sends, along with the message field, another field called host, which is a simple string. Then, when you try to assign something to [host][ip], it fails, because [host] is not an object but a string.
What you can try to do is rename the original host field to something else, and it should work. Add this just before your grok filter:
mutate {
  rename => {
    "[host]" => "[source][address]"
  }
}
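For context, here is a minimal sketch of how the whole filter block could look once the rename is in place, with the grok semantics switched to Logstash's nested-field reference syntax ([host][name] instead of host.name) so the parsed values land under a host object as ECS expects. This is an untested sketch based on your original pattern, not a verified config; adjust the field names to whatever your SIEM mapping requires:
filter {
  # rsyslog ships a plain-string "host" field; move it aside before grok runs
  mutate {
    rename => { "[host]" => "[source][address]" }
  }
  grok {
    # [host][name], [process][name], etc. create nested objects rather than
    # literal dotted field names, which matches the ECS host.*/process.* mapping
    match => { "message" => "<%{INT:[log][syslog][priority]}>%{SYSLOGTIMESTAMP:timestamp} %{SYSLOGHOST:[host][name]} %{DATA:[process][name]}(?:\[%{POSINT:[process][pid]}\])?: %{GREEDYDATA:syslog_message}" }
  }
}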