elasticsearch, logstash, kibana, elastic-stack, elastic-beats

How to collect logs from different servers to a central server (Elasticsearch and Kibana)


I have been assigned the task of creating a central logging server. In my case there are many web app servers spread across different locations. My task is to collect the logs from these servers and manage them on a central server running Elasticsearch and Kibana.

Question

  1. Is it possible to collect logs from servers that have different public IPs? If so, how?
  2. How many resources (CPU, memory, storage) are required on the central server?

What I have tried

I am looking for a way to send logs over a public IP to Elasticsearch.


Solution

    1. Yes, it is possible to collect logs from servers that have different public IPs. You need to set up an agent such as Filebeat (provided by Elastic) on each server that produces logs.

      • Set up a Filebeat instance on each machine.

    It will watch the log files on each machine and forward them to the Logstash instance you specify in the filebeat.yml configuration file, like below:

    #=========================== Filebeat inputs =============================
    
    filebeat.inputs:
    
    - type: log
    
      # Change to true to enable this input configuration.
      enabled: true
    
      # Paths that should be crawled and fetched. Glob based paths.
      paths:
        - /path_to_your_log_1/ELK/your_log1.log
        - /path_to_your_log_2/ELK/your_log2.log
    
    #----------------------------- Logstash output --------------------------------
    output.logstash:
      # The Logstash hosts
      hosts: ["private_ip_of_logstash_server:5044"]
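
    On the central (receiving) server, Logstash needs a pipeline that listens on port 5044 for Beats connections and forwards events to Elasticsearch. A minimal sketch, assuming Elasticsearch runs on the same host as Logstash (the file path and index name below are assumptions, not part of the original setup):

    # /etc/logstash/conf.d/beats.conf  (path is an assumption)
    input {
      beats {
        # Accept incoming Filebeat connections on port 5044
        port => 5044
      }
    }

    output {
      elasticsearch {
        # Assumes Elasticsearch runs on the same host as Logstash
        hosts => ["localhost:9200"]
        # One index per day, following the usual Filebeat naming convention
        index => "filebeat-%{+YYYY.MM.dd}"
      }
    }

    You can then point Kibana at the same Elasticsearch instance and create an index pattern such as filebeat-* to browse the logs.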
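
    When the web app servers can only reach the central server over the public internet, the Filebeat output would point at the central server's public IP or DNS name instead of a private IP, with port 5044 opened in the firewall and TLS enabled so logs are not shipped in plain text. A sketch of that variant of filebeat.yml (the hostname and certificate path are assumptions):

    #----------------------------- Logstash output --------------------------------
    output.logstash:
      # Public IP or DNS name of the central Logstash server (assumed value)
      hosts: ["logstash.example.com:5044"]
      # CA certificate used to verify the Logstash server's TLS certificate
      # (path is an assumption; requires TLS to be configured on the beats input)
      ssl.certificate_authorities: ["/etc/filebeat/certs/logstash-ca.crt"]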