algorithm, sorting, linked-list, complexity-theory

What's the fastest algorithm for sorting a linked list?


I'm curious if O(n log n) is the best a linked list can do.


Solution

  • It is reasonable to expect that you cannot do better than O(N log N) running time with a comparison-based sort.

    However, the interesting part is to investigate whether you can sort it in place, whether the sort is stable, what its worst-case behavior is, and so on.

    Simon Tatham, of PuTTY fame, explains how to sort a linked list with merge sort. He concludes with the following comments:

    Like any self-respecting sort algorithm, this has running time O(N log N). Because this is Mergesort, the worst-case running time is still O(N log N); there are no pathological cases.

    Auxiliary storage requirement is small and constant (i.e. a few variables within the sorting routine). Thanks to the inherently different behaviour of linked lists from arrays, this Mergesort implementation avoids the O(N) auxiliary storage cost normally associated with the algorithm.

    There is also an example implementation in C that works for both singly and doubly linked lists; a minimal sketch in the same spirit appears at the end of this answer.

    As @Jørgen Fogh mentions below, big-O notation may hide constant factors that let one algorithm outperform another, for example because of memory locality or a small number of items.
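
    For concreteness, here is a minimal C sketch of a bottom-up merge sort on a singly linked list, in the spirit of the approach described above. The struct node type with an int value field is an assumption for illustration; Simon Tatham's published code is more general and also covers doubly linked and circular lists. Each pass merges runs of length 1, 2, 4, ... using only a few pointer variables, so the auxiliary storage stays constant.

        #include <stdio.h>

        /* Hypothetical node type for illustration; the published code is
         * more general and also handles doubly linked and circular lists. */
        struct node {
            int value;
            struct node *next;
        };

        /* Bottom-up merge sort on a singly linked list: merge runs of length
         * 1, 2, 4, ... until a single pass needs only one merge.  Only a few
         * pointer variables are used, so auxiliary storage is O(1). */
        struct node *list_sort(struct node *list)
        {
            int insize = 1;

            if (!list)
                return NULL;

            for (;;) {
                struct node *p = list, *tail = NULL;
                int nmerges = 0;            /* merges performed in this pass */

                list = NULL;
                while (p) {
                    struct node *q = p;
                    int i, psize = 0, qsize = insize;

                    nmerges++;
                    /* advance q by up to insize nodes to find the second run */
                    for (i = 0; i < insize && q; i++) {
                        psize++;
                        q = q->next;
                    }

                    /* merge the run starting at p with the run starting at q */
                    while (psize > 0 || (qsize > 0 && q)) {
                        struct node *e;

                        if (psize == 0) {
                            e = q; q = q->next; qsize--;
                        } else if (qsize == 0 || !q) {
                            e = p; p = p->next; psize--;
                        } else if (p->value <= q->value) {  /* <= keeps it stable */
                            e = p; p = p->next; psize--;
                        } else {
                            e = q; q = q->next; qsize--;
                        }
                        if (tail) tail->next = e; else list = e;
                        tail = e;
                    }
                    p = q;                  /* continue after the merged runs */
                }
                tail->next = NULL;

                if (nmerges <= 1)           /* the whole list was one run: done */
                    return list;
                insize *= 2;                /* double the run length and repeat */
            }
        }

        int main(void)
        {
            struct node n[5] = {
                {5, &n[1]}, {1, &n[2]}, {4, &n[3]}, {2, &n[4]}, {3, NULL}
            };
            struct node *p = list_sort(&n[0]);

            for (; p; p = p->next)
                printf("%d ", p->value);    /* prints: 1 2 3 4 5 */
            printf("\n");
            return 0;
        }

    The <= comparison when the two run heads are equal takes the element from the earlier run first, which is what keeps the sort stable, and the absence of recursion is what keeps the storage requirement constant.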