Tags: algorithm, sorting

Is it possible to reduce the number of comparisons in brute-force sorting from O(N^2) to O(N) by using multiplication?


When the result of a comparison is assigned to an integer:

r1 = A>B
r2 = B>C

we don't need to compare A against C because

r3 = r1 * r2 = 1

only holds when

A>C

right? Only 2 comparisons and 1 multiplication to find if A>C.
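A quick check of the three-element case (a sketch in Python, not from the question): the product decides A vs C in only one of the six possible orderings, and it misses two orderings where A > C actually holds.

```python
# The product r1*r2 = 1 forces A > C (a descending chain A > B > C),
# but A > C does NOT force r1*r2 = 1, so the "free" comparison only
# works in one direction.
from itertools import permutations

implied = []   # orderings where the product decides A vs C
missed = []    # orderings where A > C but the product says nothing
for A, B, C in permutations([1, 2, 3]):
    r1 = int(A > B)
    r2 = int(B > C)
    if r1 * r2 == 1:
        assert A > C          # the implication always holds
        implied.append((A, B, C))
    elif A > C:
        missed.append((A, B, C))

print(len(implied), len(missed))
```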

Continuing with an extra element:

r4 = C>D
r3 * r4 = A>D
r1 * r4 = not needed, as the pairs A,B and C,D are independent
r2 * r4 = B>D
...

So, is there a simple multiply/add scheme that finds the full comparison matrix of all elements using no more than N actual comparisons? With such a matrix, an array of unique elements could be sorted with fewer than O(N^2) comparisons (though the total number of operations, multiplications included, would stay the same). Could this be modeled as a matrix-multiplication operation (one that could perhaps be accelerated on the tensor cores of a CUDA GPU)?
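The matrix formulation can be sketched directly (a toy model under my own assumptions, not an established algorithm; `inferred_pairs` is an invented name): encode the N-1 adjacent comparisons as a 0/1 matrix, then propagate every A > B > ... > C chain by repeated boolean matrix squaring, which is exactly the matrix-product form a GPU could run. The catch shows up immediately: for most inputs, many pairs stay undecided.

```python
import numpy as np

def inferred_pairs(a):
    """Start from the N-1 adjacent comparisons a[i] vs a[i+1], then use
    log2(N) rounds of boolean matrix "multiplication" (squaring) to
    propagate every descending chain, as suggested in the question."""
    n = len(a)
    M = np.zeros((n, n), dtype=int)
    for i in range(n - 1):
        if a[i] > a[i + 1]:
            M[i, i + 1] = 1
        else:
            M[i + 1, i] = 1
    for _ in range(max(1, int(np.ceil(np.log2(n))))):
        M = np.minimum(1, M + M @ M)   # boolean transitive-closure step
    return M

# Only 4 of the 6 pairs get decided: 4 vs 2 and 3 vs 2 remain unknown,
# so adjacent comparisons alone cannot yield the full comparison matrix.
M = inferred_pairs([4, 3, 1, 2])
print(int(M.sum()), "of 6 pairs decided")
```

For an already-descending input like `[4, 3, 2, 1]` the closure does decide all 6 pairs, which matches the intuition that the trick only pays off on favorable orderings.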

Edit: per derpirscher's comment, it should require more than just adjacent comparisons.

i: index
i > i+1
i > i+2
i > i+4
i > i+8
i > i+N/2 ----> log2(N) stride doublings ----> N*log2(N) comparisons
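Counting the comparisons in that stride-doubling scheme confirms the n·log(n) figure (a quick sketch; `stride_comparison_count` is an invented name):

```python
import math

def stride_comparison_count(n):
    """Count the comparisons in the edit above: each index i is compared
    against i+1, i+2, i+4, ..., i+n/2 (power-of-two strides)."""
    total = 0
    stride = 1
    while stride <= n // 2:
        total += n - stride   # pairs (i, i+stride) that fit in the array
        stride *= 2
    return total

n = 1024
print(stride_comparison_count(n), "comparisons; n*log2(n) =", n * int(math.log2(n)))
```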

Solution

  • In the best case? Yes.

    If you test the right comparisons, you can definitely sort a list in as few as n-1 comparisons. Because mostly sorted data is common in the real world, some sorting algorithms try to find runs in the data and use them to sort more quickly. Timsort is a good example.

    But there is no getting around Stirling's approximation, which says that ln(n!) = n ln(n) - n + O(ln(n)). Since a comparison-based sort must be able to distinguish all n! possible orderings of a list, it needs at least log2(n!) = Ω(n log(n)) bits of information, i.e. comparisons. So the average performance of a comparison-based algorithm, let alone its worst-case performance, cannot be better than that.

    Going back to your multiplication idea: what goes wrong? It's simple. While multiplication can sometimes give you information about comparisons that you didn't actually do, on average it doesn't. The product r1 * r2 decides A vs C only when it equals 1; whenever it is 0, you still have no idea how A compares to C.
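The lower bound above is easy to check numerically (a minimal sketch; the helper name `min_comparisons` is mine, not from the answer):

```python
import math

def min_comparisons(n):
    """Information-theoretic lower bound: a comparison sort must
    distinguish n! orderings, and each comparison yields one bit,
    so at least ceil(log2(n!)) comparisons are needed."""
    return math.ceil(math.log2(math.factorial(n)))

print(min_comparisons(10))        # 22: even 10 elements need 22 comparisons
print(10 * math.log2(10))         # ~33.2: n*log2(n) overshoots, per Stirling
```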