scala, apache-spark

Spark : java.lang.NoClassDefFoundError: scala/collection/mutable/ArraySeq$ofRef


I am trying to run a simple word count program with spark-submit and getting an exception.

Exception in thread "main" java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: scala/collection/mutable/ArraySeq$ofRef
    at SparkWordCount$.main(SparkWordCount.scala:18)

The code, starting at line 18, is:

val count = input.flatMap(line => line.split(" "))
    .map(word => (word, 1))
    .reduceByKey(_ + _)
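
For reference, a minimal self-contained program around that snippet might look like the sketch below; the object name SparkWordCount matches the stack trace, while the argument handling and output path are assumptions:

    import org.apache.spark.{SparkConf, SparkContext}

    object SparkWordCount {
      def main(args: Array[String]): Unit = {
        // App name taken from the stack trace; the master URL is supplied by spark-submit
        val conf = new SparkConf().setAppName("SparkWordCount")
        val sc = new SparkContext(conf)

        // Assumed: input path is the first argument, output path the second
        val input = sc.textFile(args(0))

        val count = input.flatMap(line => line.split(" "))
          .map(word => (word, 1))
          .reduceByKey(_ + _)

        count.saveAsTextFile(args(1))
        sc.stop()
      }
    }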

My environment:


Solution

  • As stated in the comments, the solution is to use the same Scala version for development that you will use on the cluster; a build configuration sketch follows below.
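
For example, a minimal build.sbt sketch, assuming an sbt project and a cluster running a Spark 2.4.x distribution built against Scala 2.11 (check the actual versions with spark-submit --version and substitute them):

    // build.sbt -- the versions below are assumptions; match them to your cluster
    name := "spark-word-count"

    // The Scala version must match the one the cluster's Spark build uses;
    // a mismatch can surface at runtime as NoClassDefFoundError for collection classes
    scalaVersion := "2.11.12"

    // "provided" because spark-submit supplies the Spark jars at runtime
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.8" % "provided"

The %% operator appends the Scala binary version (here 2.11) to the artifact name, so pinning scalaVersion also selects the matching spark-core artifact.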