apache-spark

Changing the tmp directory not working in Spark


I wanted to change the tmp directory used by Spark, so I had something like this in my spark-submit:

 spark-submit <other parameters> --conf "spark.local.dir=<somedirectory>" <other parameters>

But I am noticing that it has no effect, as Spark still uses the default tmp directory. What am I doing wrong here?

By the way, I am using Spark's standalone cluster.


Solution

  • From https://spark.apache.org/docs/2.1.0/configuration.html

    "In Spark 1.0 and later, spark.local.dir is overridden by the SPARK_LOCAL_DIRS (Standalone, Mesos) or LOCAL_DIRS (YARN) environment variables set by the cluster manager."
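
    So on a standalone cluster, a minimal sketch would be to set SPARK_LOCAL_DIRS in conf/spark-env.sh on each worker node and restart the workers (the path below is just a placeholder, not a value from your setup):

     # conf/spark-env.sh on each worker node
     # /path/to/scratch is a placeholder; use your intended scratch directory
     export SPARK_LOCAL_DIRS=/path/to/scratch

     # restart the standalone cluster so workers pick up the new value
     sbin/stop-all.sh
     sbin/start-all.sh

    Because the environment variable takes precedence, passing --conf "spark.local.dir=..." to spark-submit is ignored on a standalone cluster, which is why your setting has no effect.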