A very simple Filebeat configuration:
filebeat:
  inputs:
    - type: filestream
      id: vouchers-logs-stream
      paths:
        - /path/to/logs/*.log
      json:
        keys_under_root: true
        add_error_key: true
        overwrite_keys: true
        message_key: message
      parsers:
        - ndjson:
            target: ""
            add_error_key: true

output:
  elasticsearch:
    hosts: [ "..." ]
    username: "..."
    password: "..."
    index: voucher-app-logs-%{[agent.version]}-%{+yyyy.MM.dd}

setup:
  template:
    name: "voucher-app-logs"
    pattern: "voucher-app-logs*"
    overwrite: false
  ilm:
    enabled: true
    policy_name: "voucher-app-logs-lifecycle-policy"
This configuration creates the following:
My questions:
I have read the docs and found nothing about that. Perhaps I missed something?
Beats is doing what it is told to ^^
In the output section:

output:
  elasticsearch:
    hosts: [ "..." ]
    username: "..."
    password: "..."
    index: voucher-app-logs-%{[agent.version]}-%{+yyyy.MM.dd}
                                                ^^^^^^^^^^^^^^
The %{+yyyy.MM.dd} part will change every day. So every day the Beat is going to look for a data stream that does not exist yet, and will create one.
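The daily rotation can be illustrated with a quick sketch. The agent version 8.13.4 is a made-up example here, and Python's strftime only mimics the Joda-style %{+yyyy.MM.dd} format that Filebeat actually uses:

```python
from datetime import date

# Hypothetical agent version; Filebeat fills %{[agent.version]} itself.
agent_version = "8.13.4"

# %{+yyyy.MM.dd} formats the event timestamp as year.month.day,
# so the resolved index name changes at every date boundary.
index_name = f"voucher-app-logs-{agent_version}-{date.today():%Y.%m.%d}"
print(index_name)
```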
Is removing the date an option for you?
output:
  elasticsearch:
    hosts: [ "..." ]
    username: "..."
    password: "..."
    index: voucher-app-logs-%{[agent.version]}
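Since setup.ilm.enabled: true is already set, index rotation can be left to ILM instead of the date in the name. Filebeat loads a default policy under the configured policy_name; if you want different thresholds, you can define the policy yourself via the Elasticsearch _ilm/policy API. A minimal sketch — the rollover and retention values below are illustrative assumptions, not from the original post:

```json
PUT _ilm/policy/voucher-app-logs-lifecycle-policy
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": {
            "max_primary_shard_size": "50gb",
            "max_age": "30d"
          }
        }
      },
      "delete": {
        "min_age": "90d",
        "actions": {
          "delete": {}
        }
      }
    }
  }
}
```

With a policy like this, ILM rolls the backing index over on size or age, so the index name no longer needs to change daily.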