json · elasticsearch · parsing · logging · filebeat

Parsing Filebeat logs to send to Elasticsearch


Is there any existing way or tool to parse the Filebeat logs (/var/log/filebeat/filebeat-<date>.ndjson) before sending them to Elasticsearch? I'm using version 8.8.0 for both Filebeat and ES.

I've tried sending the Filebeat logs, which are in JSON format (multiple JSON objects per file, one per line), to ES, but each JSON object simply gets stored in a single message field instead of being broken down into searchable key-value pairs.

Specifically, I'm only interested in a few key-value pairs in the JSON logs, so it would be even better if I could send just those fields to ES instead of storing everything.


Solution

  • You can use Filebeat's parsers to index the filebeat-<date>.ndjson logs into Elasticsearch.

    filebeat.inputs:
    - type: filestream
      id: filebeat-own-logs
      paths:
        - /var/log/filebeat/filebeat-*.ndjson
      parsers:
        - ndjson:
            target: ""          # decode the JSON keys into the root of the event
            add_error_key: true # add error.message if a line fails to decode
    

    Each Filebeat log entry is a single JSON line, so no multiline parser is needed; the ndjson parser decodes every line into top-level fields that are searchable in Elasticsearch.

    ndjson is a well-suited format for indexing data into Elasticsearch. See the following discussion of a similar case: https://discuss.elastic.co/t/in-filebeat-processing-json-log-files-that-does-not-need-processing/344146
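
  • Since you only want a few of the key-value pairs, you can additionally use Filebeat's include_fields processor to drop everything else before shipping. A minimal sketch; the field names listed below (log.level, log.logger, message) are assumptions based on typical Filebeat 8.x log output, so adjust them to the fields you actually need:

        processors:
          - include_fields:
              fields: ["log.level", "log.logger", "message"]
    

    Note that @timestamp and @metadata are always kept by this processor, so you don't need to list them explicitly.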