Tags: java, scala, apache-spark, jvm, spark-core

How to initialise SparkContext with custom properties?


I am learning Spark using the spark-shell.

When running the spark-shell from the terminal, a SparkContext is already provided by default. I want to apply some custom settings to the SparkContext (like setMaster("local") and setAppName("KVApp")).

When I tried to do this from the spark shell as follows:

scala> var conf= new SparkConf().setMaster("local").setAppName("MyApp")
conf: org.apache.spark.SparkConf = org.apache.spark.SparkConf@55fb92f8

scala> val sc = new SparkContext(conf)

I got the following error:

org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:82)
org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
$iwC$$iwC.<init>(<console>:15)
$iwC.<init>(<console>:24)
<init>(<console>:26)
.<init>(<console>:30)
.<clinit>(<console>)
.<init>(<console>:7)
.<clinit>(<console>)
$print(<console>)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)

This is expected, since a SparkContext was already created when the spark shell started.

Is there any way to start the spark shell with customized properties?


Solution

  • You can pass these settings on the command line when starting the shell:

    spark-shell --master "..." --name "..."
    

    You can run spark-shell --help to see all the options available.
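
    For example, with the settings from the question (a local master and the app name KVApp), the shell could be started as sketched below; arbitrary configuration keys can also be passed with --conf key=value:

        spark-shell --master local --name KVApp

    Once the shell is up, the sc it provides already carries those settings, which can be checked from the prompt (sc.master and sc.appName are standard SparkContext accessors):

        scala> sc.master     // expected: local
        scala> sc.appName    // expected: KVApp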