apache-spark

Can spark-defaults.conf resolve environment variables?


If I have a line like below in my spark-env.sh file

export MY_JARS=$(jars=(/my/lib/dir/*.jar); IFS=,; echo "${jars[*]}")

which gives me a comma delimited list of jars in /my/lib/dir, is there a way I can specify

spark.jars $MY_JARS

in the spark-defaults.conf?


Solution

  • tl;dr No, it cannot, but there is a workaround.

    Spark reads spark-defaults.conf as a plain Java properties file and does no environment variable substitution on the values.

    What you can do instead is have spark-env.sh write the computed value of MY_JARS straight into spark-defaults.conf using >> (append). For duplicate keys the last entry wins, so it does not matter if several similar entries accumulate.
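    A minimal sketch of that workaround in spark-env.sh, assuming the jars live in /my/lib/dir and that spark-defaults.conf sits in the usual ${SPARK_HOME}/conf location (adjust both paths to your install):

    # spark-env.sh -- build the comma-separated jar list, then append it
    # as a spark.jars entry to spark-defaults.conf (paths are assumptions)
    export MY_JARS=$(jars=(/my/lib/dir/*.jar); IFS=,; echo "${jars[*]}")
    echo "spark.jars $MY_JARS" >> "${SPARK_HOME}/conf/spark-defaults.conf"

    Because the last entry for a key takes effect, re-running this on every startup keeps working; the file just grows, so you may want to prune old spark.jars lines from time to time.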