python, apache-spark, dataframe, pyspark, apache-spark-sql

Sort in descending order in PySpark


I'm using PySpark (Python 2.7.9/Spark 1.3.1) and have a DataFrame, GroupObject, which I need to filter and sort in descending order. I'm trying to achieve this with the following piece of code:

group_by_dataframe.count().filter("`count` >= 10").sort('count', ascending=False)

But it throws the following error:

sort() got an unexpected keyword argument 'ascending'

Solution

  • In PySpark 1.3 the sort method doesn't take an ascending parameter. You can use the desc method of Column instead:

    from pyspark.sql.functions import col
    
    (group_by_dataframe
        .count()
        .filter("`count` >= 10")
        .sort(col("count").desc()))
    

    or the desc function:

    from pyspark.sql.functions import desc
    
    (group_by_dataframe
        .count()
        .filter("`count` >= 10")
        .sort(desc("count")))
    

    Both methods can be used with Spark >= 1.3 (including Spark 2.x).
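
    For completeness, below is a minimal, self-contained sketch of the same pipeline using the SparkSession entry point (Spark >= 2.0). The sample data and the key column are made up for illustration:

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, desc
    
    spark = SparkSession.builder.appName("sort-desc-example").getOrCreate()
    
    # Hypothetical stand-in for the original grouped data
    df = spark.createDataFrame(
        [("a",), ("a",), ("b",), ("b",), ("b",), ("c",)],
        ["key"],
    )
    
    counts = df.groupBy("key").count()
    
    # Column method ...
    counts.filter("`count` >= 2").sort(col("count").desc()).show()
    
    # ... or the desc function; both produce the same ordering
    counts.filter("`count` >= 2").sort(desc("count")).show()

    Note also that, since the answer says the ascending parameter is missing specifically in 1.3, the original snippet with sort('count', ascending=False) should work as written on Spark >= 1.4.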