elasticsearch, apache-spark, elasticsearch-hadoop

How to set the es.nodes parameter to multiple Elasticsearch nodes for Spark?


I want to read data from multiple Elasticsearch nodes into Spark. I would prefer to use the "es.nodes" parameter and set "es.nodes.discovery" to false. The configuration parameters are described here. I tried to find an example of how to set "es.nodes" to an array of values, but I couldn't. Help please.


Solution

  • "es.nodes" does not take an array; it takes a single comma-separated string of host:port entries. With "es.nodes.discovery" set to false, the connector will talk only to the nodes you list:

    passthrough {
      es.nodes.discovery: false
      es.nodes: "foobar1:9200,foobar2:9200"
    }
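Outside a passthrough block, the same two settings can be passed directly as Spark read options. A minimal sketch of assembling them in Python; the host names are hypothetical, and "es.nodes" is built by joining the list into the comma-separated string the connector expects:

```python
# Build elasticsearch-hadoop options for a Spark read.
# "es.nodes" takes one comma-separated string, not an array.
nodes = ["es-node1:9200", "es-node2:9200", "es-node3:9200"]  # hypothetical hosts

es_options = {
    "es.nodes": ",".join(nodes),    # "es-node1:9200,es-node2:9200,es-node3:9200"
    "es.nodes.discovery": "false",  # use only the listed nodes, no discovery
}

print(es_options["es.nodes"])
```

With a SparkSession and the connector on the classpath, these options would then be passed along the lines of `spark.read.format("org.elasticsearch.spark.sql").options(**es_options).load("my-index")`.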