scala, apache-spark, cassandra-2.1, webui

Running application doesn't appear in the Spark web UI, but it runs


I need your help. I created two apps (one uses the Spray framework; the other receives messages from Kafka and sends them to Cassandra). Both run continuously and should never stop. I'm running Spark in standalone mode on the server, and my configuration is:

- In spark-env.sh:

SPARK_MASTER_IP=MYIP
SPARK_EXECUTOR_CORES=2
SPARK_MASTER_PORT=7077
SPARK_EXECUTOR_MEMORY=4g
#SPARK_WORKER_PORT=65000
MASTER=spark://${SPARK_MASTER_IP}:${SPARK_MASTER_PORT}
SPARK_LOCAL_IP=MYIP
SPARK_MASTER_WEBUI_PORT=8080

- In spark-defaults.conf:
spark.master                     spark://MYIPMASTER:7077
spark.eventLog.enabled           true
spark.eventLog.dir               /opt/spark-1.6.1-bin-hadoop2.6/spark-events
spark.history.fs.logDirectory    /opt/spark-1.6.1-bin-hadoop2.6/logs
spark.io.compression.codec       lzf
spark.cassandra.connection.host MYIPMASTER
spark.cassandra.auth.username   LOGIN
spark.cassandra.auth.password   PASSWORD

I can access both pages, MYIP:8080/ and MYIP:4040/, but on http://MYIP:8080/ I only see my workers; I can't see my running application.

When I submit, I use this:

/opt/spark-1.6.1-bin-hadoop2.6/bin/spark-submit --class MYCLASS --verbose --conf spark.eventLog.enable=true --conf spark.master.ui.port=8080 --master local[2] /opt/spark-1.6.1-bin-hadoop2.6/jars/MYJAR.jar

Why? Could you help me?

Thanks a lot :)


Solution

  • In your spark-submit command you are using --master local[2], which submits the application in local mode. In local mode the application never registers with the standalone master, so it only appears in the driver UI on port 4040, not in the master UI on port 8080. If you want to run it on the standalone cluster you have set up, pass the Spark master URL in the --master option instead, i.e. --master spark://MYIPMASTER:7077; see the corrected command below.
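
For example, your submit command would look like the following (a sketch using the class name, master address, and jar path from your question; note also that the event-log key is spark.eventLog.enabled, not spark.eventLog.enable):

/opt/spark-1.6.1-bin-hadoop2.6/bin/spark-submit \
  --class MYCLASS \
  --verbose \
  --conf spark.eventLog.enabled=true \
  --master spark://MYIPMASTER:7077 \
  /opt/spark-1.6.1-bin-hadoop2.6/jars/MYJAR.jar

You can also drop --conf spark.master.ui.port=8080: the master web UI port is already set by SPARK_MASTER_WEBUI_PORT in spark-env.sh, and passing it at submit time has no effect on an already-running master. Once the application connects to the cluster master, it should appear under "Running Applications" on MYIP:8080.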