apache-spark

Multiple SparkSessions in single JVM


I have a question about creating multiple Spark sessions in one JVM. I have read that creating multiple contexts is not recommended in earlier versions of Spark. Is this true of SparkSession in Spark 2.0 as well?

I am thinking of making a call to a web service or a servlet from the UI; the service would create a Spark session, perform some operation, and return the result. This means a Spark session would be created for every request from the client side. Is this practice recommended?

Say I have a method something like:

public void runSpark() throws Exception {
  // Builds a new session against a standalone master, or returns the
  // session that already exists in this JVM.
  SparkSession spark = SparkSession
    .builder()
    .master("spark://<masterURL>")
    .appName("JavaWordCount")
    .getOrCreate();
}

and so on....

If I put this method in a web service, will there be any JVM issues? I am able to invoke this method multiple times from a main method, but I am not sure whether this is good practice.


Solution

  • It is not supported and won't be. SPARK-2243 is resolved as Won't Fix.

    If you need multiple contexts, there are different projects which can help you (Mist, Livy).
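
  • Rather than building a session per request, the usual pattern is to create one SparkSession when the service starts and reuse it for every request — `getOrCreate()` already returns the existing session if one is alive in the JVM, which is why calling `runSpark()` repeatedly from a main method appears to work. A minimal sketch of that lazy-singleton pattern in plain Java, using a hypothetical `Session` stand-in instead of a real SparkSession so it runs without Spark on the classpath:

    ```java
    import java.util.concurrent.atomic.AtomicInteger;

    public class SharedSession {
        // Counts how many times a "session" is actually constructed.
        public static final AtomicInteger builds = new AtomicInteger();

        // Stand-in for SparkSession (hypothetical; not a Spark class).
        public static final class Session {
            Session() { builds.incrementAndGet(); }
        }

        private static volatile Session instance;

        // Mirrors the semantics of SparkSession.builder().getOrCreate():
        // construct the session once, then hand the same instance to all callers.
        public static Session getOrCreate() {
            Session s = instance;
            if (s == null) {
                synchronized (SharedSession.class) {
                    if (instance == null) {
                        instance = new Session();
                    }
                    s = instance;
                }
            }
            return s;
        }

        public static void main(String[] args) {
            Session a = getOrCreate();
            Session b = getOrCreate(); // a second "request" reuses the session
            System.out.println(a == b);       // true
            System.out.println(builds.get()); // 1
        }
    }
    ```

    If individual requests need isolated SQL configuration or temporary views, `SparkSession.newSession()` returns a new session that still shares the single underlying SparkContext, which keeps you within the one-context-per-JVM limit.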