Tags: memory, configuration, apache-spark, pyspark

How can I find out the amount of memory PySpark has from the IPython interface?


I launched with the command

IPYTHON=1 MASTER=local[4] pyspark

Spark greets me with

Welcome to Spark, version 1.2.1
SparkContext available as sc.

But using sc, I am not able to find out how much memory it has. How can I find this out, and if possible, how can I set it to another value as well?


Solution

  • You can query the configuration of the SparkContext like so:

    sc._conf.get('spark.executor.memory')
    

    or, if you're interested in the driver's memory:

    sc._conf.get('spark.driver.memory')
    

    The complete configuration can be retrieved as a list of tuples with

    sc._conf.getAll()
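
    As for changing these values: memory settings are applied when the driver JVM starts, so they cannot simply be reassigned on an already-running shell. A minimal sketch, assuming the standard spark-submit options that pyspark forwards in 1.x (the 2g values are only examples), is to pass them at launch time:

    IPYTHON=1 MASTER=local[4] pyspark --driver-memory 2g --executor-memory 2g

    In a standalone script, you can instead set them on a SparkConf before the SparkContext is created, for example:

    from pyspark import SparkConf, SparkContext

    # Example values only; executor memory takes effect for the new context,
    # driver memory only if the driver JVM has not been launched yet.
    conf = (SparkConf()
            .setMaster('local[4]')
            .set('spark.executor.memory', '2g')
            .set('spark.driver.memory', '2g'))
    sc = SparkContext(conf=conf)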