I have been assigned the task of creating a central logging server. In my case there are many web app servers spread across different locations. My task is to get the logs from these different servers and manage them on a central server running Elasticsearch and Kibana.
Question
I am looking for a way to send logs over a public IP to Elasticsearch.
Yes, it is possible to collect logs from servers that have different public IPs. You need to install an agent such as Filebeat (provided by Elastic) on each server that produces logs. It will watch the log files on each machine and forward them to the Logstash instance you specify in the filebeat.yml configuration file, like below:
#=========================== Filebeat inputs =============================
filebeat.inputs:
- type: log
  # Change to true to enable this input configuration.
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /path_to_your_log_1/ELK/your_log1.log
    - /path_to_your_log_2/ELK/your_log2.log

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["private_ip_of_logstash_server:5044"]
The Logstash server listens on port 5044 and streams all incoming logs through its pipeline configuration file:
input {
  beats {
    port => 5044
  }
}

filter {
  # your log filtering logic goes here
}

output {
  elasticsearch {
    hosts => ["elasticsearch_server_private_ip:9200"]
    index => "your_index_name"
  }
}
In Logstash you can filter and split your logs into fields before sending them to Elasticsearch, as sketched below.
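As a minimal sketch, assuming your lines look something like "2019-01-01T10:00:00Z INFO some message" (the pattern is only an illustration and must be adapted to your actual log format), the empty filter block above could be filled in with a grok filter:

filter {
  grok {
    # Parse an assumed "timestamp loglevel message" line format into fields
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:loglevel} %{GREEDYDATA:log_message}" }
  }
  date {
    # Use the parsed timestamp as the event's @timestamp
    match => [ "timestamp", "ISO8601" ]
  }
}

Each parsed field (loglevel, log_message, and so on) then becomes searchable in Kibana.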