scala, apache-spark, apache-spark-sql, udf

Use a Spark SQL UDF in the DataFrame API


How can I use a UDF that works fine in Spark SQL, like

sparkSession.sql("select * from chicago where st_contains(st_makeBBOX(0.0, 0.0, 90.0, 90.0), geom)").show

(taken from http://www.geomesa.org/documentation/user/spark/sparksql.html), via Spark's more typesafe Scala DataFrame API?


Solution

  • If you have created a function, you can register it as a UDF under a name with:

    sparkSession.sqlContext.udf.register("yourFunction", yourFunction)
    

    Here yourFunction is a plain Scala function value, and the first argument is the name it is registered under for SQL. Once registered, that name can also be used from the DataFrame API, as sketched below.

    I hope this helps.
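
    A minimal sketch of the DataFrame API side, assuming the GeoMesa spatial UDFs from the question (st_contains, st_makeBBOX) have already been registered against the session (GeoMesa's SQL support does this when it is initialised) and that the chicago table with its geom column is available; the same filter can then be written by looking the registered UDFs up by name with callUDF, or by embedding the SQL fragment with expr:

        import org.apache.spark.sql.functions.{callUDF, col, expr, lit}

        // "chicago" and "geom" come from the question; st_contains / st_makeBBOX
        // are assumed to be registered already by GeoMesa's SQL integration.
        val chicago = sparkSession.table("chicago")

        // Option 1: look the registered UDFs up by name with callUDF.
        chicago
          .where(
            callUDF("st_contains",
              callUDF("st_makeBBOX", lit(0.0), lit(0.0), lit(90.0), lit(90.0)),
              col("geom")))
          .show()

        // Option 2: keep the SQL fragment but stay in the DataFrame API via expr.
        chicago
          .where(expr("st_contains(st_makeBBOX(0.0, 0.0, 90.0, 90.0), geom)"))
          .show()

    Both calls stay in the DataFrame API while reusing the UDFs that were registered for SQL.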