scala, apache-spark, apache-spark-sql, user-defined-functions

How to Access a WrappedArray from a DataFrame's Map


I have a DataFrame like this:

    +------+------------------------------------------------------------------------------+
    |myKeys|myMaps                                                                        |
    +------+------------------------------------------------------------------------------+
    |b     |Map(b -> WrappedArray([1,o], [4,xxx]), a -> WrappedArray([1,o], [1,n], [1,n]))|
    |a     |Map(b -> WrappedArray([1,o], [4,n]), a -> WrappedArray([4,c], [1,n], [1,n]))  |
    |a     |Map(b -> WrappedArray([4,o], [3,n]), a -> WrappedArray([4,o], [1,n], [1,n]))  |
    |b     |Map(b -> WrappedArray([4,a], [3,n]), a -> WrappedArray([1,o], [1,n], [1,n]))  |
    +------+------------------------------------------------------------------------------+

With this schema

    root
     |-- myKeys: string (nullable = false)
     |-- myMaps: map (nullable = true)
     |    |-- key: string
     |    |-- value: array (valueContainsNull = true)
     |    |    |-- element: struct (containsNull = true)
     |    |    |    |-- _1: string (nullable = true)
     |    |    |    |-- _2: string (nullable = true)

Here is the code to create it:

    import scala.collection.mutable
    import org.apache.spark.sql.Row
    import org.apache.spark.sql.functions.{lit, udf}

    // assumed definition (omitted in the original): one array column per record
    case class testSchema(myArrays: Array[(String, String, String)])

    val x = sc.parallelize(Seq(
      Array(("a", "1", "o"), ("a", "1", "n"), ("b", "1", "o"), ("a", "1", "n"), ("b", "4", "xxx")),
      Array(("a", "1", "o"), ("a", "1", "n"), ("b", "1", "o"), ("a", "1", "n"), ("b", "4", "n")),
      Array(("a", "1", "o"), ("a", "1", "n"), ("b", "4", "o"), ("a", "1", "n"), ("b", "3", "n")),
      Array(("a", "1", "o"), ("a", "1", "n"), ("b", "4", "o"), ("a", "1", "n"), ("b", "3", "n"))
    )).map(x => testSchema(x)).toDF("myArrays")


    val y = x.withColumn("myKeys", lit("b"))

    val getMap = udf((mouvements: mutable.WrappedArray[Row]) => {
      // group the (key, a, b) triples by key, keeping the (a, b) pairs as values
      mouvements.toArray
        .map(line => (line(0).toString, line(1).toString, line(2).toString))
        .groupBy(_._1)
        .map { case (k, values) => k -> values.map(x => (x._2, x._3)) }
    })


    val df_with_map = y.select($"myKeys", getMap($"myArrays") as "myMaps")
    df_with_map.show(false)
    df_with_map.printSchema()

Now I would like to access the second element of every array entry whose first element equals 4, looking the array up in the map under the key given by the myKeys column. I should get a result like this:

    +---+
    |val|
    +---+
    |xxx|
    |c  |
    |o  |
    |a  |
    +---+

I already tried it with this UDF:

    val getMyValue = udf { (myKey: String, myMaps: Map[String, mutable.WrappedArray[Row]]) =>
      val first_val = "4"
      val myArrays = myMaps.get(myKey)
      val res = myArrays.get.toArray.filter(x => x.getString(0) == first_val)
      res
    }

    val df_value = df_with_map.select(getMyValue($"myKeys", $"myMaps") as "myValue")
    df_value.show(false)
    df_value.printSchema()

But it returns the error

    java.lang.UnsupportedOperationException: Schema for type org.apache.spark.sql.Row is not supported

on the line:

    val getMyValue = udf { (myKey: String, myMaps: Map[String, mutable.WrappedArray[Row]]) =>

Do you have any idea?


Solution

  • Spark cannot derive a schema for org.apache.spark.sql.Row, so Row cannot appear in a udf signature; the exception is thrown the moment the udf is defined. Skip the UDF entirely and use the typed Dataset API, where the structs decode to tuples. With a minimal example:

    val first_val = "4"
    val df = Seq(
      ("b", Map("b" -> Seq(("1", "o"), ("4", "xxx"))))
    ).toDF("myKeys", "myMaps")
    
    root
     |-- myKeys: string (nullable = true)
     |-- myMaps: map (nullable = true)
     |    |-- key: string
     |    |-- value: array (valueContainsNull = true)
     |    |    |-- element: struct (containsNull = true)
     |    |    |    |-- _1: string (nullable = true)
     |    |    |    |-- _2: string (nullable = true)
    
    df.select($"myMaps".getItem("b"))
      .as[Seq[(String, String)]]
      .flatMap(xs => xs.filter(_._1 == first_val).map(_._2))
    
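    As a quick sanity check on the one-row df above (nothing new assumed; first_val is still "4"), only the pair ("4", "xxx") survives the filter:

        df.select($"myMaps".getItem("b"))
          .as[Seq[(String, String)]]
          .flatMap(xs => xs.filter(_._1 == first_val).map(_._2))
          .show()
        // +-----+
        // |value|
        // +-----+
        // |  xxx|
        // +-----+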

    Edit: to look each array up under the per-row key stored in myKeys, decode the whole row and flatMap:

    df.as[(String, Map[String,Seq[(String, String)]])].flatMap {
      case (key, map) => 
        map.getOrElse(key, Seq[(String, String)]()).filter(_._1 == first_val).map(_._2)
    }