Tags: docker, elasticsearch, docker-compose, kibana, econnrefused

ECONNREFUSED when connecting Kibana to Elasticsearch with Docker Compose


I am trying to connect Kibana to Elasticsearch using Docker Compose, but I get the error: Unable to retrieve version information from Elasticsearch nodes. connect ECONNREFUSED XXX:9200

This is my docker-compose.yml:

version: "2.2"

services:
  es01:
    image: docker.elastic.co/elasticsearch/elasticsearch:${STACK_VERSION}
    volumes:
      - esdata01:/usr/share/elasticsearch/data
    ports:
      - ${ES_PORT}:9200
    environment:
      - node.name=es01
      - cluster.name=${CLUSTER_NAME}
      - cluster.initial_master_nodes=es01
      - ELASTIC_PASSWORD=${ELASTIC_PASSWORD}
      - bootstrap.memory_lock=true
      - network.host=0.0.0.0
      - xpack.security.enabled=false
      - xpack.security.http.ssl.enabled=false
      - xpack.security.transport.ssl.enabled=false
      - xpack.license.self_generated.type=${LICENSE}
    mem_limit: ${MEM_LIMIT}
    ulimits:
      memlock:
        soft: -1
        hard: -1

  kibana:
    image: docker.elastic.co/kibana/kibana:${STACK_VERSION}
    volumes:
      - kibanadata:/usr/share/kibana/data
    ports:
      - ${KIBANA_PORT}:5601
    environment:
      - SERVERNAME=kibana
      - ELASTICSEARCH_HOSTS=https://es01:9200
      - ELASTICSEARCH_USERNAME=kibana_system
      - ELASTICSEARCH_PASSWORD=${KIBANA_PASSWORD}
      - elasticsearch.ssl.verificationMode=none
      - SERVER_HOST=0.0.0.0
    mem_limit: ${MEM_LIMIT}
    healthcheck:
      test:
        [
          "CMD-SHELL",
          "curl -s -I http://localhost:5601 | grep -q 'HTTP/1.1 302 Found'",
        ]
      interval: 10s
      timeout: 10s
      retries: 120
    
  fscrawler:
    image: dadoonet/fscrawler:2.10-SNAPSHOT
    container_name: fscrawler
    restart: always
    volumes:
      - ./data:/tmp/es:ro
      - ./config:/root/.fscrawler
      - ./logs:/usr/share/fscrawler/logs
    depends_on:
      - es01
    ports:
      - 8080:8080
    command: fscrawler job_name --restart --rest

volumes:
  certs:
    driver: local
  esdata01:
    driver: local
  kibanadata:
    driver: local

And this is my .env:

# THIS FILE IS AUTOMATICALLY GENERATED FROM /contrib/src/main/resources/xxx DIR.

# Password for the 'elastic' user (at least 6 characters)
ELASTIC_PASSWORD=changeme

# Password for the 'kibana_system' user (at least 6 characters)
KIBANA_PASSWORD=changeme

# Version of Elastic products
STACK_VERSION=8.6.2

# Set the cluster name
CLUSTER_NAME=docker-cluster

# Set to 'basic' or 'trial' to automatically start the 30-day trial
#LICENSE=basic
LICENSE=trial

# Port to expose Elasticsearch HTTP API to the host
ES_PORT=9200

# Port to expose Kibana to the host
KIBANA_PORT=5601

# Increase or decrease based on the available host memory (in bytes)
MEM_LIMIT=1073741824

# Project namespace (defaults to the current folder name if not set)
COMPOSE_PROJECT_NAME=fscrawler

Can someone please help me?

Thanks

I tried binding to 0.0.0.0, disabling SSL and security, and using a custom network, but none of these works.
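
One likely cause worth checking: with xpack.security.http.ssl.enabled=false, Elasticsearch answers plain HTTP on 9200, but ELASTICSEARCH_HOSTS points Kibana at https://es01:9200, so the TLS handshake can never succeed. A minimal sketch of a corrected kibana service, assuming the same service names as above (http instead of https, plus depends_on so Elasticsearch starts first):

```yaml
  kibana:
    image: docker.elastic.co/kibana/kibana:${STACK_VERSION}
    ports:
      - ${KIBANA_PORT}:5601
    depends_on:
      - es01
    environment:
      - SERVERNAME=kibana
      - SERVER_HOST=0.0.0.0
      # Plain HTTP, matching xpack.security.http.ssl.enabled=false on es01
      - ELASTICSEARCH_HOSTS=http://es01:9200
```

Note also that with xpack.security.enabled=false, Elasticsearch ignores the kibana_system credentials entirely, so the username/password variables are not needed.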


Solution

  • I found a solution for Elasticsearch with Kibana and FSCrawler.

    docker-compose.yml:

    version: "2.2"
    
    services:
      es01:
        image: docker.elastic.co/elasticsearch/elasticsearch:${STACK_VERSION}
        volumes:
          - esdata01:/usr/share/elasticsearch/data
        ports:
          - ${ES_PORT}:9200
        environment:
          - node.name=es01
          - cluster.name=${CLUSTER_NAME}
          - cluster.initial_master_nodes=es01
          - bootstrap.memory_lock=true
          - network.host=0.0.0.0
          - xpack.security.enabled=false
          - xpack.security.http.ssl.enabled=false
          - xpack.security.transport.ssl.enabled=false
          - xpack.license.self_generated.type=${LICENSE}
        mem_limit: ${MEM_LIMIT}
        ulimits:
          memlock:
            soft: -1
            hard: -1
    
      kibana:
        image: docker.elastic.co/kibana/kibana:8.6.2
        container_name: kibana-ui
        ports:
          - 5601:5601
        depends_on:
          - es01
        
      fscrawler:
        image: dadoonet/fscrawler:2.10-SNAPSHOT
        container_name: fscrawler
        restart: always
        volumes:
          - ./data:/tmp/es:ro
          - ./config:/root/.fscrawler
          - ./logs:/usr/share/fscrawler/logs
        depends_on:
          - es01
        ports:
          - 8080:8080
        command: fscrawler job_name --restart --rest
    
    volumes:
      esdata01:
        driver: local
      kibanadata:
        driver: local
    

    FSCrawler config (config/job_name/_settings.yaml, mounted at /root/.fscrawler):

    name: "job_name"
    fs:
      indexed_chars: 100%
      lang_detect: true
      continue_on_error: true
      ocr:
        language: "eng"
        enabled: true
        pdf_strategy: "ocr_and_text"
    elasticsearch:
      nodes:
        - url: "http://es01:9200"
      username: "elastic"
      password: "changeme"
      ssl_verification: false
    rest:
      url: "http://fscrawler:8080"
    
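
    Since the container is started with --rest, FSCrawler also exposes its REST service on port 8080. A quick smoke test from the host (the file path is just an example):

    ```shell
    # Service status of the FSCrawler REST endpoint
    curl -s http://127.0.0.1:8080/fscrawler/

    # Upload a document for indexing via the REST service
    curl -s -F "file=@./data/sample.pdf" http://127.0.0.1:8080/fscrawler/_upload
    ```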

    When I launch docker-compose, everything works. To connect Kibana to Elasticsearch, I can't generate an enrollment token (SSL is disabled), so I configure it manually with http://<elastic-container-ip>:9200
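
    One way to look up that container IP (assuming the default compose network; the container name below follows the fscrawler project namespace set in .env):

    ```shell
    # Print the IP address of the es01 container on its compose network
    docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' fscrawler-es01-1
    ```

    Since Kibana and es01 share the same compose network, http://es01:9200 also works and avoids hard-coding an IP that changes on restart.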

    This setup is not recommended for production (SSL and security are disabled), but it can be good enough for testing.

    Thanks for your response, it helped me a lot. :)