I am using Spark 2.3.2.3.1.0.0-78. I tried to use

spark_session.sparkContext._conf.get('spark.executor.memory')

but I only received None. How can I get spark.executor.memory's value?
If you received None, it means that you're using the default value of 1g (see the docs). Only if you explicitly give spark.executor.memory a value in your spark-submit or pyspark command will you be able to retrieve a non-null value.
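The same applies if you build the session yourself instead of using the shell. A minimal sketch (the app name and the 2g value here are just illustrations):

from pyspark.sql import SparkSession

# Set spark.executor.memory explicitly when building the session,
# so _conf.get() returns '2g' instead of None.
spark_session = (
    SparkSession.builder
    .appName("executor-memory-demo")        # hypothetical app name
    .config("spark.executor.memory", "2g")  # illustrative value
    .getOrCreate()
)
print(spark_session.sparkContext._conf.get("spark.executor.memory"))  # '2g'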
So you can still programmatically conclude that if the output of your ._conf.get() is None, your executor has the default 1g of memory.
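A small helper along these lines would do it; this is just a sketch, and the executor_memory name is my own:

def executor_memory(spark_session):
    # Fall back to Spark's documented default of 1g when
    # spark.executor.memory was never set explicitly.
    value = spark_session.sparkContext._conf.get("spark.executor.memory")
    return value if value is not None else "1g"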
Demonstration:
Starting up a pyspark shell without any special configuration on the command line:
pyspark
And then executing the command to get the config's value returns nothing, because the result is None:
>>> sc._conf.get("spark.executor.memory")
Starting up a pyspark shell with a different value for spark.executor.memory:
pyspark --conf spark.executor.memory=2g
And then executing the same command does return a value:
>>> sc._conf.get("spark.executor.memory")
'2g'
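Note that SparkConf.get also accepts a default value as its second argument, so you can fold the fallback into the lookup itself. In the unconfigured shell from the first demonstration:

>>> sc._conf.get("spark.executor.memory", "1g")
'1g'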