Tags: apache-spark, rdd, partitioning, partitioner

Why does sortBy transformation trigger a Spark job?


As per the Spark documentation, only RDD actions can trigger a Spark job; transformations are lazily evaluated and only run when an action is called on the resulting RDD.

Yet I see that the sortBy transformation is applied immediately, and it shows up as a job in the Spark UI. Why?


Solution

  • sortBy is implemented using sortByKey, which depends on a RangePartitioner (JVM) or a partitioning function (Python). When you call sortBy / sortByKey, this partitioner (partitioning function) is initialized eagerly and samples the input RDD to compute the partition boundaries. The job you see in the Spark UI corresponds to this sampling pass.

    The actual sorting is performed only when you execute an action on the newly created RDD or its descendants.
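To make the mechanism above concrete, here is a minimal pure-Python sketch (not Spark's actual implementation) of what the sampling job computes: it samples the data to pick boundary keys, and those boundaries then route every key to a partition such that concatenating the sorted partitions yields a totally sorted result. The function names `compute_bounds` and `partition_for` are hypothetical, for illustration only.

```python
import bisect
import random

def compute_bounds(data, num_partitions, sample_size=100):
    # Sampling pass: this is the work the eager job performs.
    # Spark's RangePartitioner does something similar (with reservoir
    # sampling over partitions) to estimate key-range boundaries.
    sample = sorted(random.sample(data, min(sample_size, len(data))))
    step = len(sample) / num_partitions
    # Pick num_partitions - 1 boundary keys at evenly spaced ranks.
    return [sample[int(step * i)] for i in range(1, num_partitions)]

def partition_for(key, bounds):
    # Route a key to its partition by binary search over the bounds,
    # so partition i holds keys no greater than any key in partition i+1.
    return bisect.bisect_left(bounds, key)
```

Only `compute_bounds` requires a pass over (a sample of) the data, which is why a job runs at `sortBy` time; the per-key routing and the per-partition sort happen later, when an action forces the shuffle.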