Tags: java, queue, big-o, min-max-heap

Deleting min and max from min-max queue class in less than linear time


I'm implementing a queue class that has both a popMin and a popMax method. So far I have it working with two priority queues, but although removing the head of one queue is O(log n), I also have to remove that element from the other queue, and PriorityQueue.remove(Object) is linear. I know that a double-ended priority queue can be implemented using a binary heap, but if I'm not mistaken that takes linear time to build? Is there a way I can do this more efficiently? Also, I can only use Java standard library classes.

import java.util.Collections;
import java.util.PriorityQueue;

static class MinMaxQueue {

    // Two heaps holding the same elements: one ordered ascending, one descending.
    PriorityQueue<String> minQueue = new PriorityQueue<>();
    PriorityQueue<String> maxQueue = new PriorityQueue<>(Collections.reverseOrder());

    void push(String val) {
        minQueue.add(val);  // O(log n)
        maxQueue.add(val);  // O(log n)
    }

    String popMin() {
        // remove(Object) scans the other heap's backing array: O(n)
        maxQueue.remove(minQueue.peek());
        return minQueue.remove();  // O(log n)
    }

    String popMax() {
        minQueue.remove(maxQueue.peek());  // O(n)
        return maxQueue.remove();          // O(log n)
    }

    int size() {
        return minQueue.size();
    }
}

Solution

  • A min-max heap supports O(1) find-min/find-max and O(log n) delete-min/delete-max. Guava's MinMaxPriorityQueue is an implementation of a min-max heap.
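    For illustration, a minimal sketch of the Guava route (the surrounding demo class and names are mine; it assumes Guava is on the classpath):

        import com.google.common.collect.MinMaxPriorityQueue;

        public class MinMaxDemo {
            public static void main(String[] args) {
                // Backed by a min-max heap: both ends removable in O(log n).
                MinMaxPriorityQueue<String> queue = MinMaxPriorityQueue.create();
                queue.add("b");
                queue.add("a");
                queue.add("c");

                System.out.println(queue.pollFirst()); // "a" -- removes the min
                System.out.println(queue.pollLast());  // "c" -- removes the max
                System.out.println(queue.size());      // 1
            }
        }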

    There's no min-max heap implementation in the Java standard library. If you can't use a third-party library and don't want to implement your own min-max heap, you can try TreeSet, which is backed by a red-black tree. It gives O(log n) delete-min/delete-max; the trade-off is that find-min/find-max becomes O(log n) as well, instead of O(1). You could thread the tree to keep track of the min/max, but that would require substantial changes to TreeSet. A sketch of this approach follows.
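    One caveat with TreeSet is that it silently drops duplicates, so if the queue must hold repeated values, a TreeMap used as a sorted multiset is the usual workaround. A minimal sketch under that assumption (the class name and structure are mine, not from the answer above):

        import java.util.TreeMap;

        class TreeMapMinMaxQueue {
            // Sorted multiset: value -> number of occurrences.
            private final TreeMap<String, Integer> counts = new TreeMap<>();
            private int size = 0;

            void push(String val) {        // O(log n)
                counts.merge(val, 1, Integer::sum);
                size++;
            }

            String popMin() {              // O(log n)
                String min = counts.firstKey();
                removeOne(min);
                return min;
            }

            String popMax() {              // O(log n)
                String max = counts.lastKey();
                removeOne(max);
                return max;
            }

            int size() {
                return size;
            }

            private void removeOne(String key) {
                // Decrement the count and drop the entry once it reaches zero.
                if (counts.merge(key, -1, Integer::sum) == 0) {
                    counts.remove(key);
                }
                size--;
            }
        }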

    If you stick with two PriorityQueues for your min-max queue, you could in principle keep a HashMap from each element to its index in the other queue, since removing an element at a known heap index is O(log n). But java.util.PriorityQueue does not expose its internal removeAt(int), and every element movement inside a heap would require updating the HashMap, so you would effectively have to reimplement the heap yourself. I strongly suggest not doing this because of the extra space usage and probably poor performance; a lazy-deletion variant that needs only the public API is sketched below.
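    For completeness, here is that lazy-deletion variant (my own sketch, not from the answer above): instead of tracking indices, record each popped value in a map of pending removals and purge stale heads from the other queue when they surface. Each element is physically removed from each heap once, so pops are amortized O(log n):

        import java.util.Collections;
        import java.util.HashMap;
        import java.util.Map;
        import java.util.PriorityQueue;

        class LazyMinMaxQueue {
            private final PriorityQueue<String> minQueue = new PriorityQueue<>();
            private final PriorityQueue<String> maxQueue = new PriorityQueue<>(Collections.reverseOrder());
            // Values popped from one heap but not yet physically removed from the other.
            private final Map<String, Integer> pending = new HashMap<>();
            private int size = 0;

            void push(String val) {
                minQueue.add(val);
                maxQueue.add(val);
                size++;
            }

            String popMin() {
                purge(minQueue);
                String min = minQueue.remove();
                pending.merge(min, 1, Integer::sum); // defer removal from maxQueue
                size--;
                return min;
            }

            String popMax() {
                purge(maxQueue);
                String max = maxQueue.remove();
                pending.merge(max, 1, Integer::sum); // defer removal from minQueue
                size--;
                return max;
            }

            int size() {
                return size;
            }

            // Discard heads that were already popped from the other heap.
            private void purge(PriorityQueue<String> queue) {
                while (!queue.isEmpty() && pending.getOrDefault(queue.peek(), 0) > 0) {
                    pending.merge(queue.remove(), -1, Integer::sum);
                }
            }
        }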