elasticsearch, logstash, logstash-grok

Logstash add a field of type Date


I have logs that contain a timestamp in the following format: 20231030 09:41:20.179021. I want to parse it into the date type in Logstash.

This is my current filter:

filter {
  grok { 
    match => { 
      "message" => "%{YEAR:year}(?<month>\d{2})(?<day>\d{2}) %{TIME:log_time}"
    }
  }
  mutate {
    add_field => {
      "mytime" => "%{year}-%{month}-%{day}T%{log_time}Z"
    }
    remove_field => ["day", "month", "year", "log_time"]
  }
  date {
    match => ["mytime", "yyyy-MM-dd'T'HH:mm:ss.SSSSSSZ", "ISO8601"]
    timezone => "UTC"
    target => "mytime" 
  }
}

But this doesn't seem to be working: although the field mytime gets added, it is not of type date; instead it is of type keyword.

What is it that I'm doing wrong?


Solution

  • As stated in the documentation, if you haven't created a mapping, Elasticsearch creates a dynamic mapping and indexes your fields as keyword or other types automatically. Here is the source: https://www.elastic.co/guide/en/elasticsearch/reference/current/dynamic-mapping.html

    If you already have a mapping and you index new documents whose fields do not fit the current index, and dynamic mapping is enabled, Elasticsearch still maps those fields dynamically. You can check how a field actually got mapped, as shown below.
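    For example, the get-mapping API shows how Elasticsearch mapped each field (the index name my-logs is a hypothetical stand-in for your index):

    GET my-logs/_mapping

    For a dynamically mapped string field you will typically see "type": "text" with a keyword sub-field, or "type": "keyword", rather than "type": "date".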

    What you should do is create a mapping yourself and tell Elasticsearch how you want your fields mapped. This is called explicit mapping. Here is the source: https://www.elastic.co/guide/en/elasticsearch/reference/current/explicit-mapping.html

    If you have just one index and want to map it, you can set your explicit mapping directly when you create the index, as in the sketch below.
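    A minimal sketch, assuming the index is named my-logs and that mytime arrives as an ISO8601 string:

    PUT my-logs
    {
      "mappings": {
        "properties": {
          "mytime": { "type": "date" }
        }
      }
    }

    With this mapping in place, Elasticsearch stores mytime as a date instead of letting dynamic mapping decide its type.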

    If you have one or more similar indices and want to map them explicitly, I recommend using index templates; a sketch of such a template follows.
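    A sketch of a composable index template that applies the same mapping to every index whose name matches a pattern (the template name and the my-logs-* pattern are assumptions):

    PUT _index_template/my-logs-template
    {
      "index_patterns": ["my-logs-*"],
      "template": {
        "mappings": {
          "properties": {
            "mytime": { "type": "date" }
          }
        }
      }
    }

    Any index created afterwards with a matching name (for example my-logs-2023.10.30) picks up this mapping automatically.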

    If you have an existing index and want to remap it explicitly, I recommend reindexing into a new index and pointing an alias at it, especially if you do not want to lose the data in the old index; existing field mappings cannot be changed in place, so the data has to be copied into a correctly mapped index. See the sketch below.
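    A sketch of that approach, with hypothetical index names my-logs-old and my-logs-new:

    PUT my-logs-new
    {
      "mappings": {
        "properties": {
          "mytime": { "type": "date" }
        }
      }
    }

    POST _reindex
    {
      "source": { "index": "my-logs-old" },
      "dest": { "index": "my-logs-new" }
    }

    POST _aliases
    {
      "actions": [
        { "add": { "index": "my-logs-new", "alias": "my-logs" } }
      ]
    }

    Your Logstash output and your queries can then keep using the alias my-logs, while the old index stays untouched until you decide to delete it.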
