apache-spark, apache-spark-sql

Generate a Spark StructType / Schema from a case class


If I want to create a StructType (i.e. a DataFrame.schema) from a case class, is there a way to do it without creating a DataFrame? I can easily do:

case class TestCase(id: Long)
val schema = Seq[TestCase]().toDF.schema  // toDF needs spark.implicits._ (or sqlContext.implicits._) in scope

But it seems overkill to actually create a DataFrame when all I want is the schema.

(If you are curious, the reason behind the question is that I am defining a UserDefinedAggregateFunction. To do so you override a couple of methods that return StructTypes, and I use case classes for them.)


Solution

  • You can do it the same way SQLContext.createDataFrame does it:

    import org.apache.spark.sql.catalyst.ScalaReflection
    import org.apache.spark.sql.types.StructType

    val schema = ScalaReflection.schemaFor[TestCase].dataType.asInstanceOf[StructType]
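
    As a follow-up, here is a minimal sketch of how the derived schemas could be plugged into the UserDefinedAggregateFunction scenario from the question. It assumes the Spark 2.x-era UserDefinedAggregateFunction API (deprecated in Spark 3 in favor of Aggregator); the names CountIds, Input, and Buffer are illustrative, not part of the original question.

    import scala.reflect.runtime.universe.TypeTag

    import org.apache.spark.sql.Row
    import org.apache.spark.sql.catalyst.ScalaReflection
    import org.apache.spark.sql.expressions.{MutableAggregationBuffer, UserDefinedAggregateFunction}
    import org.apache.spark.sql.types.{DataType, LongType, StructType}

    // Illustrative case classes describing the input and buffer rows.
    case class Input(id: Long)
    case class Buffer(count: Long)

    // Hypothetical UDAF that counts input rows; both schemas are derived from case classes.
    object CountIds extends UserDefinedAggregateFunction {
      private def schemaOf[T: TypeTag]: StructType =
        ScalaReflection.schemaFor[T].dataType.asInstanceOf[StructType]

      override def inputSchema: StructType  = schemaOf[Input]
      override def bufferSchema: StructType = schemaOf[Buffer]
      override def dataType: DataType       = LongType
      override def deterministic: Boolean   = true

      override def initialize(buffer: MutableAggregationBuffer): Unit = buffer(0) = 0L
      override def update(buffer: MutableAggregationBuffer, input: Row): Unit =
        buffer(0) = buffer.getLong(0) + 1L
      override def merge(buffer1: MutableAggregationBuffer, buffer2: Row): Unit =
        buffer1(0) = buffer1.getLong(0) + buffer2.getLong(0)
      override def evaluate(buffer: Row): Any = buffer.getLong(0)
    }

    In more recent Spark versions, Encoders.product[TestCase].schema should yield the same StructType without reaching into the internal catalyst package.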