scala, apache-spark, dictionary, apache-spark-sql

How to convert a Map to a DataFrame?


m is a map as following:

scala> m
res119: scala.collection.mutable.Map[Any,Any] = Map(A -> 0.11164610291904906, B -> 0.11856755943424617, C -> 0.1023171832681312)

I want to get:

name  score
A     0.11164610291904906
B     0.11856755943424617
C     0.1023171832681312

How to get the final dataframe?


Solution

  • First convert it to a Seq, then you can use the toDF() function. (Note that the map in the question is typed Map[Any,Any]; toDF needs concrete element types, so the example below declares the map with String keys and Double values.)

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder.getOrCreate()
    import spark.implicits._

    val m = Map("A" -> 0.11164610291904906, "B" -> 0.11856755943424617, "C" -> 0.1023171832681312)
    val df = m.toSeq.toDF("name", "score")
    df.show
    

    Will give you:

    +----+-------------------+
    |name|              score|
    +----+-------------------+
    |   A|0.11164610291904906|
    |   B|0.11856755943424617|
    |   C| 0.1023171832681312|
    +----+-------------------+
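If your map really is a Map[Any,Any] as in the question, toDF cannot infer a schema from it directly. One way (a plain-Scala sketch, with the Spark call itself omitted) is to narrow the entries to a typed Seq first:

```scala
import scala.collection.mutable

// The map in the question is Map[Any, Any]; toDF needs concrete
// element types, so first narrow each entry to (String, Double).
val m: mutable.Map[Any, Any] =
  mutable.Map("A" -> 0.11164610291904906, "B" -> 0.11856755943424617, "C" -> 0.1023171832681312)

val typed: Seq[(String, Double)] =
  m.toSeq.collect { case (k: String, v: Double) => (k, v) }

// Inside a Spark session (with spark.implicits._ imported),
// typed.toDF("name", "score") then produces the DataFrame above.
```

Note that collect silently drops entries that are not (String, Double) pairs; if you would rather fail fast on unexpected types, use map with a pattern match instead.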