Tags: scala, apache-spark, output, spark-submit

Output results of spark-submit


I'm a beginner in Spark and Scala programming. I tried running an example with spark-submit in local mode; it runs to completion without any errors or other messages, but I can't see any output in the console or in the Spark History web UI. Where and how can I see the results of my program run with spark-submit?

This is the command I run:

spark-submit --master local[*] --conf spark.history.fs.logDirectory=/tmp/spark-events --conf spark.eventLog.enabled=true --conf spark.eventLog.dir=/tmp/spark-events --conf spark.history.ui.port=18080 --class com.intel.analytics.bigdl.models.autoencoder.Train dist/lib/bigdl-0.5.0-SNAPSHOT-jar-with-dependencies.jar -f /opt/work/mnist -b 8

and this is a screenshot from the end of the program run:
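One likely reason nothing appears in the console is that the BigDL `Train` example mainly writes checkpoints and training summaries rather than printing results; Spark only shows driver-side output that your code explicitly prints or saves. A minimal sketch (the object name and values are hypothetical) of how results become visible from a spark-submit run:

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical example: results computed on executors are invisible
// in the driver console until they are brought back to the driver.
object ShowResults {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("show-results")
      .getOrCreate()

    val losses = spark.sparkContext.parallelize(Seq(0.91, 0.47, 0.23))

    // collect() pulls the data to the driver, so println output
    // shows up in spark-submit's stdout.
    losses.collect().foreach(println)

    // Alternatively, persist results so they survive after the job ends:
    // losses.saveAsTextFile("/tmp/autoencoder-losses")

    spark.stop()
  }
}
```

If the example you are running only logs metrics, the event-log/History Server setup in the answer below the line is where to look instead.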


Solution

  • Locate your spark-defaults.conf (or copy spark-defaults.conf.template to spark-defaults.conf)

    Create a logging directory (e.g. /tmp/spark-events/)

    Add these two lines:

    spark.eventLog.enabled           true
    spark.eventLog.dir               file:///tmp/spark-events/
    

    Then run sbin/start-history-server.sh

    This makes every job run via spark-submit log to the event directory, and their overviews stay available in the History Server web UI (http://localhost:18080/) without keeping your Spark job running.

    More info: https://spark.apache.org/docs/latest/monitoring.html

    PS: On macOS via Homebrew, all of this lives under /usr/local/Cellar/apache-spark/[version]/libexec/
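Put together, the steps above can be sketched as the following shell session (a sketch, assuming `$SPARK_HOME` points at your Spark installation; adjust paths for your setup):

```shell
# Create the event-log directory the config points at.
mkdir -p /tmp/spark-events

# Append the two event-log settings to spark-defaults.conf.
cat >> "$SPARK_HOME/conf/spark-defaults.conf" <<'EOF'
spark.eventLog.enabled           true
spark.eventLog.dir               file:///tmp/spark-events/
EOF

# Start the History Server, then browse to http://localhost:18080/
"$SPARK_HOME/sbin/start-history-server.sh"
```

After your next spark-submit run finishes, the application should appear under "Completed applications" in the History Server UI, and its event log as a file in /tmp/spark-events/.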