I have a scenario where Logstash uses Grok to split the message into different fields. One field, message_date, is extracted and sent to Elasticsearch. Right now the type of message_date in Elasticsearch/Kibana is string instead of date. The index mapping setting is "dynamic": true.
The date string in the message looks like this: yyyy-MM-dd HH:mm:ss,SSS, and the grok filter is:
grok {
  match => { "message" => ['%{TIMESTAMP_ISO8601:message_date} %{GREEDYDATA:message}'] }
}
I need to find a way to change the type to date in Elasticsearch or, as a last resort, in Logstash.
By default, date detection is enabled on the Elasticsearch side. The default patterns that are recognized automatically are:
"dynamic_date_formats": [ "strict_date_optional_time", "yyyy/MM/dd HH:mm:ss Z||yyyy/MM/dd Z" ]
You now have two options:
A. You modify your Logstash config to massage your date value into one of those supported formats.
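For example, a mutate filter with gsub could rewrite the value into the ISO 8601 shape that strict_date_optional_time accepts (a minimal sketch, assuming the message_date field produced by your grok filter; the sample value in the comments is illustrative):

# Turns "2023-06-01 12:34:56,789" into "2023-06-01T12:34:56.789",
# which matches the default strict_date_optional_time pattern.
mutate {
  gsub => [
    # replace the space separator with "T"
    "message_date", " ", "T",
    # replace the comma before the milliseconds with "."
    "message_date", ",", "."
  ]
}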
B. You modify the dynamic_date_formats to include your specific date format when creating your index:
PUT <your-index>
{
  "mappings": {
    "dynamic_date_formats": [
      "strict_date_optional_time",
      "yyyy/MM/dd HH:mm:ss Z||yyyy/MM/dd Z",
      "yyyy-MM-dd HH:mm:ss,SSS"
    ]
  }
}
The last date pattern will recognize your date format, and a date field will be created instead of a text one.
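To verify, you can index a sample document and inspect the resulting field mapping (a hypothetical check; the date value is made up):

POST <your-index>/_doc
{
  "message_date": "2023-06-01 12:34:56,789"
}

GET <your-index>/_mapping/field/message_date

The second call should now report "type": "date" for message_date.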