elasticsearch, logstash, elastic-stack, logstash-configuration

Logstash cannot connect to the Elasticsearch cluster with X-Pack enabled


The problem I ran into is that Logstash can no longer connect to the Elasticsearch cluster after X-Pack security was enabled.

This is an Elasticsearch cluster of four nodes with X-Pack security enabled. I generated a new certificate for the cluster's transport.ssl and applied it in the configuration file.

(Screenshot: index list showing jsonfile-daemonset-syslog-2022.12.21)

The screenshot above shows an index named "jsonfile-daemonset-syslog-2022.12.21" that was created before X-Pack was enabled on the cluster. Since X-Pack was enabled, Logstash can no longer send new logs to the cluster, and no new indices are created.

root@esnode-1:/etc/elasticsearch# cat /etc/hosts
127.0.0.1 localhost

172.16.20.66 esnode-1
172.16.20.60 esnode-2
172.16.20.105 esnode-3
172.16.100.28 esnode-4
172.16.20.87   logstash

root@esnode-1:/etc/elasticsearch# cat elasticsearch.yml |grep -v '^$' | grep -v '^#'
cluster.name: will-cluster1
node.name: esnode-1
path.data: /data/elasticsearch
path.logs: /var/log/elasticsearch
network.host: 172.16.20.66
http.port: 9200
discovery.seed_hosts: ["esnode-1","esnode-2","esnode-3","esnode-4"]
cluster.initial_master_nodes: ["esnode-1","esnode-2","esnode-3","esnode-4"]
xpack.security.enabled: true
xpack.security.enrollment.enabled: true
xpack.security.http.ssl:
  enabled: true
  keystore.path: certs/http.p12
xpack.security.transport.ssl:
  enabled: true
  verification_mode: certificate
  keystore.path: certs/elastic-certificates.p12
  truststore.path: certs/elastic-certificates.p12
http.host: 0.0.0.0

$ /usr/share/elasticsearch/bin/elasticsearch-certutil ca (no password set)

$ /usr/share/elasticsearch/bin/elasticsearch-certutil cert --ca elastic-stack-ca.p12 (password set: 123456)
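
Since the transport certificate above was generated with a password (123456), each node's Elasticsearch keystore also has to hold that password for the keystore/truststore settings, in case this has not been done yet. A minimal sketch, assuming the default deb paths used above:

$ /usr/share/elasticsearch/bin/elasticsearch-keystore add xpack.security.transport.ssl.keystore.secure_password    # enter 123456 when prompted
$ /usr/share/elasticsearch/bin/elasticsearch-keystore add xpack.security.transport.ssl.truststore.secure_password  # enter 123456 when prompted
$ systemctl restart elasticsearch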

Note:
user: elastic
password: ednFPXyz357@#

user: kibana_system
password: kibana357xy@

user: logstash_system
password: logstashXyZ235#

root@esnode-1:/etc/elasticsearch# curl --cacert /etc/elasticsearch/certs/http_ca.crt -u elastic  https://localhost:9200
Enter host password for user 'elastic':
{
  "name" : "esnode-1",
  "cluster_name" : "will-cluster1",
  "cluster_uuid" : "5aT8AVA5STity523pJhvGQ",
  "version" : {
    "number" : "8.5.3",
    "build_flavor" : "default",
    "build_type" : "deb",
    "build_hash" : "4ed5ee9afac63de92ec98f404ccbed7d3ba9584e",
    "build_date" : "2022-12-05T18:22:22.226119656Z",
    "build_snapshot" : false,
    "lucene_version" : "9.4.2",
    "minimum_wire_compatibility_version" : "7.17.0",
    "minimum_index_compatibility_version" : "7.0.0"
  },
  "tagline" : "You Know, for Search"
}
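
Since Logstash runs on a separate host (172.16.20.87), it can help to confirm first that the logstash_system credentials work over HTTPS from that host. The sketch below assumes the HTTP CA (http_ca.crt, the same file used in the curl above) is copied to /etc/logstash; the paths are only examples:

# on the Logstash host: fetch the CA that signed the HTTP certificates
$ scp root@esnode-1:/etc/elasticsearch/certs/http_ca.crt /etc/logstash/
# check that the built-in logstash_system user can authenticate against a node
$ curl --cacert /etc/logstash/http_ca.crt -u 'logstash_system:logstashXyZ235#' https://172.16.20.66:9200

Note that the HTTP layer here is validated by http_ca.crt, which may not be covered by the transport bundle elastic-certificates.p12 referenced in the Logstash output below; also, the built-in logstash_system user is meant for Logstash's internal monitoring rather than for writing event indices, so a dedicated indexing user may be needed as well.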


    root@logstash:/etc/logstash# cat /etc/logstash/logstash.yml |grep -v '^$' | grep -v '^#'
    path.data: /var/lib/logstash
    path.logs: /var/log/logstash
    root@logstash:/etc/logstash#
    root@logstash:/etc/logstash# cat /etc/logstash/conf.d/logsatsh-daemonset-jsonfile-kafka-to-es.conf 
    input {
      kafka {
        bootstrap_servers => "172.16.1.67:9092,172.16.1.37:9092,172.16.1.203:9092"
        topics => ["jsonfile-log-topic"]
        codec => "json"
      }
    }
    
    output {
      stdout { codec => rubydebug }
    }
    
    
    output {
      #if [fields][type] == "app1-access-log" {
      if [type] == "jsonfile-daemonset-applog" {
        elasticsearch {
          hosts => ["https://172.16.20.66:9200","https://172.16.20.60:9200","https://172.16.20.105:9200","https://172.16.100.28:9200"]
          index => "jsonfile-daemonset-applog-%{+YYYY.MM.dd}"
          truststore => "/etc/logstash/elastic-certificates.p12"
          user => "logstash_system"
          password => "logstashXyZ235#"
        }}
    
      if [type] == "jsonfile-daemonset-syslog" {
        elasticsearch {
          hosts => ["https://172.16.20.66:9200","https://172.16.20.60:9200","https://172.16.20.105:9200","https://172.16.100.28:9200"]
          index => "jsonfile-daemonset-syslog-%{+YYYY.MM.dd}"
          truststore => "/etc/logstash/elastic-certificates.p12"
          user => "logstash_system"
          password => "logstashXyZ235#"
        }}
    
    }

The error output from starting Logstash is shown below:

root@logstash:/etc/logstash/conf.d# /usr/share/logstash/bin/logstash  -f /etc/logstash/conf.d/logsatsh-daemonset-jsonfile-kafka-to-es.conf --path.settings=/etc/logstash
Using bundled JDK: /usr/share/logstash/jdk
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2022-12-24T12:09:04,135][INFO ][logstash.runner          ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2022-12-24T12:09:04,143][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.5.3", "jruby.version"=>"jruby 9.3.9.0 (2.6.8) 2022-10-24 537cd1f8bc OpenJDK 64-Bit Server VM 17.0.5+8 on 17.0.5+8 +indy +jit [x86_64-linux]"}
[2022-12-24T12:09:04,152][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2022-12-24T12:09:04,702][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2022-12-24T12:09:06,947][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::OrgLogstashSecretStore::SecretStoreException::LoadException", :message=>"Found a file at /etc/logstash/logstash.keystore, but it is not a valid Logstash keystore.", :backtrace=>["org.logstash.secret.store.backend.JavaKeyStore.load(JavaKeyStore.java:294)", "org.logstash.secret.store.backend.JavaKeyStore.load(JavaKeyStore.java:77)", "org.logstash.secret.store.SecretStoreFactory.doIt(SecretStoreFactory.java:129)", "org.logstash.secret.store.SecretStoreFactory.load(SecretStoreFactory.java:115)", "org.logstash.secret.store.SecretStoreExt.getIfExists(SecretStoreExt.java:60)", "org.logstash.execution.AbstractPipelineExt.getSecretStore(AbstractPipelineExt.java:582)", "org.logstash.execution.AbstractPipelineExt.initialize(AbstractPipelineExt.java:181)", "org.logstash.execution.JavaBasePipelineExt.initialize(JavaBasePipelineExt.java:72)", "org.logstash.execution.JavaBasePipelineExt$INVOKER$i$1$0$initialize.call(JavaBasePipelineExt$INVOKER$i$1$0$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:846)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1229)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuperSplatArgs(IRRuntimeHelpers.java:1202)", "org.jruby.ir.targets.indy.InstanceSuperInvokeSite.invoke(InstanceSuperInvokeSite.java:29)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$initialize$0(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:48)", "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:139)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:112)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:329)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:87)", "org.jruby.RubyClass.newInstance(RubyClass.java:911)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.ir.targets.indy.InvokeSite.invoke(InvokeSite.java:208)", "usr.share.logstash.logstash_minus_core.lib.logstash.pipeline_action.create.RUBY$method$execute$0(/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:50)", "usr.share.logstash.logstash_minus_core.lib.logstash.pipeline_action.create.RUBY$method$execute$0$__VARARGS__(/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:49)", "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:139)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:112)", "org.jruby.ir.targets.indy.InvokeSite.invoke(InvokeSite.java:208)", "usr.share.logstash.logstash_minus_core.lib.logstash.agent.RUBY$block$converge_state$2(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:386)", "org.jruby.runtime.CompiledIRBlockBody.callDirect(CompiledIRBlockBody.java:141)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:64)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)", "org.jruby.runtime.Block.call(Block.java:143)", "org.jruby.RubyProc.call(RubyProc.java:309)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:107)", "java.base/java.lang.Thread.run(Thread.java:833)"]}
[2022-12-24T12:09:07,088][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2022-12-24T12:09:07,161][INFO ][logstash.runner          ] Logstash shut down.
[2022-12-24T12:09:07,178][FATAL][org.logstash.Logstash    ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
    at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:790) ~[jruby.jar:?]
    at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:753) ~[jruby.jar:?]
    at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:91) ~[?:?]
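
Independent of the SSL settings, the fatal error above is raised by "Found a file at /etc/logstash/logstash.keystore, but it is not a valid Logstash keystore." A rough way to inspect and, if it really is unreadable, recreate that keystore (back the old file up first; paths assume the default package layout):

# list the entries of the existing keystore; this fails if the file is corrupt
$ /usr/share/logstash/bin/logstash-keystore --path.settings /etc/logstash list
# if it cannot be read, move it aside and create a fresh (empty) keystore
$ mv /etc/logstash/logstash.keystore /etc/logstash/logstash.keystore.bak
$ /usr/share/logstash/bin/logstash-keystore --path.settings /etc/logstash create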

Solution

  • @Wei Yu, it seems you are missing:

    ssl => true
    

    in the elasticsearch output block of your Logstash pipeline configuration.
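
    For reference, a sketch of what the syslog output could look like with SSL enabled. The truststore password 123456 comes from the certificate generation step in the question; whether that PKCS#12 bundle also contains the CA of the HTTP certificates should be verified, otherwise pointing cacert at http_ca.crt is an alternative:

      elasticsearch {
        hosts => ["https://172.16.20.66:9200","https://172.16.20.60:9200","https://172.16.20.105:9200","https://172.16.100.28:9200"]
        index => "jsonfile-daemonset-syslog-%{+YYYY.MM.dd}"
        ssl => true
        # trust store generated with elasticsearch-certutil; the password was set when the cert was created
        truststore => "/etc/logstash/elastic-certificates.p12"
        truststore_password => "123456"
        # alternative: cacert => "/etc/logstash/http_ca.crt"
        user => "logstash_system"
        password => "logstashXyZ235#"
      }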