Tags: algorithm, sorting, time-complexity

Find the 4th smallest element in linear time


So I was given an exercise about 2 months ago that says the following:

Given n (n>=4) distinct elements, design a divide & conquer algorithm to compute the 4th smallest element. Your algorithm should run in linear time in the worst case.

I had an extremely hard time with this problem and could only find relevant algorithms that run in O(n*k) worst-case time. After several weeks of trying, we managed, with the help of our teacher, to "solve" this problem. The final algorithm is as follows:

Rule: the input size must be a power of two, i.e. n = 2^k

(1): Divide the input into two halves of size n/2: one left array, one right array.

(2): If the input size == 4, sort the array using merge sort.
     (2.1) Merge the left array with the right array into a new result array of length 4 (keeping only the 4 smallest elements).
     (2.2) Return the element at index [4-1].

(3): Otherwise, repeat step 1 on each half.


This is solved recursively, and our base case is step 2. Step 2.2 means that after all
of our recursive calls, we end up with a final result array of length 4, and at that
point we can just return the element at index [4-1].
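
In code, this is roughly how I understand the algorithm (a rough Python sketch of my own; I am assuming that the merge in step 2.1 keeps only the 4 smallest of the merged elements, since the result array has length 4):

    def four_smallest(a):
        # base case: a block of 4 is sorted directly
        if len(a) == 4:
            return sorted(a)
        mid = len(a) // 2
        left = four_smallest(a[:mid])    # 4 smallest of the left half
        right = four_smallest(a[mid:])   # 4 smallest of the right half
        # merge two sorted blocks of 4, keeping only the 4 smallest
        out, i, j = [], 0, 0
        while len(out) < 4:
            if left[i] <= right[j]:
                out.append(left[i]); i += 1
            else:
                out.append(right[j]); j += 1
        return out

    def fourth_smallest(a):
        return four_smallest(a)[4 - 1]   # step 2.2: element at index [4-1]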

With this algorithm, my teacher claims that this runs in linear time. My problem with that statement is that we are dividing the input until we reach sub-arrays of size 4, and then those are sorted. So for an input of size 8, we would sort 2 sub-arrays of length 4, since 8/4 = 2. How is this in any case linear time? Aren't we still sorting the whole input, just in blocks? This really does not make sense to me. Whether we sort the whole input as it is, or divide it into sub-arrays of size 4 and sort those, won't it still be a worst case of O(n*log(n))?

Would appreciate some explanation on this!


Solution

  • To make it easier to prove that the algorithm runs in linear time, let's modify it a bit (we will only change the order of dividing and merging blocks, nothing more):

    (1): Divide the input into n/4 blocks, each of size 4.
    
    (2): While there is more than one block, repeat:
            Merge each pair of adjacent blocks into one block of size 4.
            (For example, if we have 4 blocks, we split them into 2 pairs -
            the first pair contains the first and second blocks,
            the second pair contains the third and fourth blocks.
            After merging we will have 2 blocks -
            the first one contains the 4 least elements of blocks 1 and 2,
            the second one contains the 4 least elements of blocks 3 and 4.)
    
    (3): The answer is the last element of the one block that is left.
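
    Here is a rough Python sketch of this modified algorithm (variable names are mine, just for illustration):

        def fourth_smallest(a):
            # (1): split into n/4 blocks of size 4; sorting each block is O(1)
            blocks = [sorted(a[i:i + 4]) for i in range(0, len(a), 4)]

            # (2): merge adjacent pairs, keeping the 4 least elements of each pair
            while len(blocks) > 1:
                merged = []
                for left, right in zip(blocks[0::2], blocks[1::2]):
                    out, i, j = [], 0, 0
                    while len(out) < 4:            # constant work per pair
                        if left[i] <= right[j]:
                            out.append(left[i]); i += 1
                        else:
                            out.append(right[j]); j += 1
                    merged.append(out)
                blocks = merged

            # (3): the answer is the last element of the one block left
            return blocks[0][3]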
    

    Proof: It's a fact that an array of constant length (in your case, 4) can be sorted in constant time. Let k = log2(n), so n = 2^k. Loop (2) runs k-2 iterations (on each iteration the number of elements left is halved, until 4 elements are left).

    Before the i-th iteration (0 <= i < k-2) there are 2^(k-i) elements left, so there are 2^(k-i-2) blocks and we will merge 2^(k-i-3) pairs of blocks. Let's find how many pairs we merge over all iterations. The count of merges equals

    mergeOperationsCount = 2^(k-3) + 2^(k-4) + ... + 2^0 =
        = 2^(k-3) * (1 + 1/2 + 1/4 + ... + 1/2^(k-3)) < 2^(k-3) * 2 = 2^(k-2) = O(2^k) = O(n)
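
    As a quick sanity check (a snippet of my own, not part of the proof), we can count the merge operations directly and compare them against the 2^(k-2) bound:

        # count merge operations for an input of n = 2^k elements
        def count_merges(n):
            merges, blocks = 0, n // 4
            while blocks > 1:
                merges += blocks // 2   # one merge per pair of blocks
                blocks //= 2
            return merges

        for k in range(2, 11):
            assert count_merges(2 ** k) == 2 ** (k - 2) - 1   # < 2^(k-2)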
    

    Since we can merge each pair in constant time (because the blocks have constant size), and building the initial sorted blocks also takes constant time per block, i.e. O(n) in total, the algorithm runs in O(n).

    And after this proof, I want to note that there is another linear algorithm which is trivial, but it is not divide-and-conquer.
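
    That trivial algorithm is presumably a single scan that maintains the 4 smallest elements seen so far; a rough sketch (my own, not part of the original answer):

        def fourth_smallest_scan(a):
            best = sorted(a[:4])       # the 4 smallest elements seen so far
            for x in a[4:]:
                if x < best[3]:        # x displaces the current 4th smallest
                    best[3] = x
                    best.sort()        # sorting 4 elements is O(1)
            return best[3]

    Each element costs a constant amount of work, so the scan is O(n), and it works for any n >= 4, not just powers of two.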