scala, apache-spark, apache-spark-sql, apache-spark-dataset, apache-spark-encoders

How to create a Dataset of Maps?


I'm using Spark 2.2 and am running into trouble when attempting to call spark.createDataset on a Seq of Map.

Code and output from my Spark Shell session follow:

// createDataSet on Seq[T] where T = Int works
scala> spark.createDataset(Seq(1, 2, 3)).collect
res0: Array[Int] = Array(1, 2, 3)

scala> spark.createDataset(Seq(Map(1 -> 2))).collect
<console>:24: error: Unable to find encoder for type stored in a Dataset.  
Primitive types (Int, String, etc) and Product types (case classes) are 
supported by importing spark.implicits._
Support for serializing other types will be added in future releases.
       spark.createDataset(Seq(Map(1 -> 2))).collect
                          ^

// createDataSet on a custom case class containing Map works
scala> case class MapHolder(m: Map[Int, Int])
defined class MapHolder

scala> spark.createDataset(Seq(MapHolder(Map(1 -> 2)))).collect
res2: Array[MapHolder] = Array(MapHolder(Map(1 -> 2)))

I've tried import spark.implicits._, though I'm fairly certain it's already imported automatically in a Spark shell session.

Is this a case not covered by the current encoders?


Solution

  • It is not covered in 2.2, but it can easily be addressed. You can add the required Encoder using ExpressionEncoder, either explicitly:

    import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder  
    import org.apache.spark.sql.Encoder
    
    spark
      .createDataset(Seq(Map(1 -> 2)))(ExpressionEncoder(): Encoder[Map[Int, Int]])
    

    or implicitly:

    implicit def mapIntIntEncoder: Encoder[Map[Int, Int]] = ExpressionEncoder()
    spark.createDataset(Seq(Map(1 -> 2)))
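For completeness, another workaround is a Kryo-based binary encoder via `Encoders.kryo`. A sketch (note the trade-off: the map is serialized into a single `binary` column rather than a native Catalyst map type, so you lose column-level access in SQL):

```scala
import org.apache.spark.sql.{Encoder, Encoders}

// Kryo serialization: stores the whole Map as one binary blob.
implicit val kryoMapEncoder: Encoder[Map[Int, Int]] =
  Encoders.kryo[Map[Int, Int]]

spark.createDataset(Seq(Map(1 -> 2))).collect
```

Also worth noting: Spark 2.3 added an implicit map encoder (`newMapEncoder` in `SQLImplicits`), so on 2.3+ `spark.createDataset(Seq(Map(1 -> 2)))` works out of the box with `spark.implicits._` in scope.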