logstash, elastic-stack, logstash-configuration, filebeat, log-analysis

Filebeat and Logstash: log data not passing


Hi, I have two servers: one runs Logstash 7.2.0 and the other runs Filebeat 7.2.0. Below is my Logstash conf file:

    input {
      beats {
        port => 5044
        ssl  => false
      }
    }
    output {
      elasticsearch {
        hosts => ["10.226.51.24:9200"]
        index => "oasis"
      }
      stdout { codec => rubydebug }
    }

Below is my filebeat.yml config:

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: false

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - D:\Oasis\Logs\InetPub_Pearson_k12internet\Ordering_Log\*
    - D:\OasisServices\Log\REST\DigitalOrders\*
#----------------------------- Logstash output --------------------------------
output.logstash:
      # The Logstash hosts
      hosts: ["110.226.51.22:5044"]

Here are the Logstash logs (I restarted the service as well):

    [2019-12-11T06:41:53,380][WARN ][logstash.runner ] SIGTERM received. Shutting down.
    [2019-12-11T06:41:58,598][WARN ][org.logstash.execution.ShutdownWatcherExt] {"inflight_count"=>0, "stalling_threads_info"=>{"other"=>[{"thread_id"=>30, "name"=>"[oasislogs]<beats", "current_call"=>"[...]/vendor/bundle/jruby/2.5.0/gems/logstash-input-beats-6.0.0-java/lib/logstash/inputs/beats.rb:204:in `run'"}, {"thread_id"=>22, "name"=>"[oasislogs]>worker0", "current_call"=>"[...]/logstash-core/lib/logstash/java_pipeline.rb:239:in `block in start_workers'"}, {"thread_id"=>23, "name"=>"[oasislogs]>worker1", "current_call"=>"[...]/logstash-core/lib/logstash/java_pipeline.rb:239:in `block in start_workers'"}, {"thread_id"=>24, "name"=>"[oasislogs]>worker2", "current_call"=>"[...]/logstash-core/lib/logstash/java_pipeline.rb:239:in `block in start_workers'"}, {"thread_id"=>25, "name"=>"[oasislogs]>worker3", "current_call"=>"[...]/logstash-core/lib/logstash/java_pipeline.rb:239:in `block in start_workers'"}, {"thread_id"=>26, "name"=>"[oasislogs]>worker4", "current_call"=>"[...]/logstash-core/lib/logstash/java_pipeline.rb:239:in `block in start_workers'"}, {"thread_id"=>27, "name"=>"[oasislogs]>worker5", "current_call"=>"[...]/logstash-core/lib/logstash/java_pipeline.rb:239:in `block in start_workers'"}, {"thread_id"=>28, "name"=>"[oasislogs]>worker6", "current_call"=>"[...]/logstash-core/lib/logstash/java_pipeline.rb:239:in `block in start_workers'"}, {"thread_id"=>29, "name"=>"[oasislogs]>worker7", "current_call"=>"[...]/logstash-core/lib/logstash/java_pipeline.rb:239:in `block in start_workers'"}]}}
    [2019-12-11T06:41:58,602][ERROR][org.logstash.execution.ShutdownWatcherExt] The shutdown process appears to be stalled due to busy or blocked plugins. Check the logs for more information. 
    [2019-12-11T06:41:59,876][INFO ][logstash.javapipeline ] Pipeline terminated {"pipeline.id"=>"oasislogs"}
    [2019-12-11T06:42:00,572][INFO ][logstash.runner ] Logstash shut down.
    [2019-12-11T06:42:10,757][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.2.0"}
    [2019-12-11T06:42:14,859][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://10.16.5.24:9200/]}}
    [2019-12-11T06:42:14,997][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://10.16.5.24:9200/"}
    [2019-12-11T06:42:15,033][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7} 
    [2019-12-11T06:42:15,035][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
    [2019-12-11T06:42:15,058][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//10.16.5.24:9200"]}
    [2019-12-11T06:42:15,111][INFO ][logstash.outputs.elasticsearch] Using default mapping template
    [2019-12-11T06:42:15,135][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
    [2019-12-11T06:42:15,138][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"oasislogs", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, :thread=>"#<Thread:0x67dc8344 run>"}
    [2019-12-11T06:42:15,176][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
    [2019-12-11T06:42:15,376][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
    [2019-12-11T06:42:15,383][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"oasislogs"}
    [2019-12-11T06:42:15,454][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:oasislogs], :non_running_pipelines=>[]}
    [2019-12-11T06:42:15,467][INFO ][org.logstash.beats.Server] Starting server on port: 5044
    [2019-12-11T06:42:15,702][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Here are the Filebeat logs:

    2019-12-11T23:17:34.189-0500  INFO  [monitoring]  log/log.go:145  Non-zero metrics in the last 30s  {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":93},"total":{"ticks":171,"value":171},"user":{"ticks":78}},"handles":{"open":107},"info":{"ephemeral_id":"25360bca-ea57-4f32-9bb0-cacf032b86af","uptime":{"ms":65970049}},"memstats":{"gc_next":4194304,"memory_alloc":1876208,"memory_total":101684848,"rss":-8192},"runtime":{"goroutines":15}},"filebeat":{"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":0,"events":{"active":0}}},"registrar":{"states":{"current":0}}}}}
    2019-12-11T23:18:04.189-0500  INFO  [monitoring]  log/log.go:145  Non-zero metrics in the last 30s  {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":93},"total":{"ticks":171,"value":171},"user":{"ticks":78}},"handles":{"open":107},"info":{"ephemeral_id":"25360bca-ea57-4f32-9bb0-cacf032b86af","uptime":{"ms":66000049}},"memstats":{"gc_next":4194304,"memory_alloc":1922992,"memory_total":101731632},"runtime":{"goroutines":15}},"filebeat":{"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":0,"events":{"active":0}}},"registrar":{"states":{"current":0}}}}}
    2019-12-11T23:18:34.192-0500  INFO  [monitoring]  log/log.go:145  Non-zero metrics in the last 30s  {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":93},"total":{"ticks":171,"value":171},"user":{"ticks":78}},"handles":{"open":107},"info":{"ephemeral_id":"25360bca-ea57-4f32-9bb0-cacf032b86af","uptime":{"ms":66030049}},"memstats":{"gc_next":4194304,"memory_alloc":1964752,"memory_total":101773392,"rss":-24576},"runtime":{"goroutines":15}},"filebeat":{"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":0,"events":{"active":0}}},"registrar":{"states":{"current":0}}}}}
    2019-12-11T23:19:04.192-0500  INFO  [monitoring]  log/log.go:145  Non-zero metrics in the last 30s  {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":93},"total":{"ticks":171,"value":171},"user":{"ticks":78}},"handles":{"open":107},"info":{"ephemeral_id":"25360bca-ea57-4f32-9bb0-cacf032b86af","uptime":{"ms":66060049}},"memstats":{"gc_next":4194304,"memory_alloc":1801872,"memory_total":101819272,"rss":49152},"runtime":{"goroutines":15}},"filebeat":{"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":0,"events":{"active":0}}},"registrar":{"states":{"current":0}}}}}

However, this is not producing any output. Where am I going wrong? Please help me out. Thanks!


Solution

  • Please try the configuration below; you need to set enabled: true for the log input instead of false. Your Filebeat logs confirm the problem: the harvester metrics show open_files:0 and running:0, so no files are being collected while the input is disabled. After changing the setting, restart Filebeat (a verification sketch follows the config).

    filebeat.inputs:
    
    # Each - is an input. Most options can be set at the input level, so
    # you can use different inputs for various configurations.
    # Below are the input specific configurations.
    
    - type: log
    
      # Change to true to enable this input configuration.
      enabled: true
    
      # Paths that should be crawled and fetched. Glob based paths.
      paths:
        - D:\Oasis\Logs\InetPub_Pearson_k12internet\Ordering_Log\*
        - D:\OasisServices\Log\REST\DigitalOrders\*
    #----------------------------- Logstash output --------------------------------
    output.logstash:
          # The Logstash hosts
          hosts: ["110.226.51.22:5044"]