algorithm, data-structures, linked-list, time-complexity, decrease-key

What is a decrease-key operation for a doubly linked list?


I came across this data structure description on Gate overflow:

𝑁 items are stored in a sorted doubly linked list. For a delete operation, a pointer is provided to the record to be deleted. For a decrease-key operation, a pointer is provided to the record on which the operation is to be performed.

I don't understand what this decrease-key operation does. What does it do, and what is its time complexity?


Solution

  • It is important to realise that the linked list is sorted and is supposed to stay sorted after every manipulation.

    The key of a node (a "record") is the node's value by which the list is sorted. A node might have other data (payload), but that is irrelevant for any of the actions that are listed.

    Here is an example of such a sorted doubly linked list, and a given pointer to a node:

    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
    β”‚ key: 1   β”‚    β”‚ key: 2   β”‚    β”‚ key: 2   β”‚    β”‚ key: 2   β”‚    β”‚ key: 3   β”‚
    β”‚   next───┼───►│   next───┼───►│   next───┼───►│   next───┼───►│   next   β”‚
    β”‚   prev   │◄───┼───prev   │◄───┼───prev   │◄───┼───prev   │◄───┼───prev   β”‚
    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                                                          β–²
                                                          β”‚
                                                     given pointer
    

    If we perform a decrease-key operation on the referenced node, then not only should its key decrease -- the node should also be moved to its new, sorted position:

    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
    β”‚ key: 1   β”‚    β”‚ key: 1   β”‚    β”‚ key: 2   β”‚    β”‚ key: 2   β”‚    β”‚ key: 3   β”‚
    β”‚   next───┼───►│   next───┼───►│   next───┼───►│   next───┼───►│   next   β”‚
    β”‚   prev   │◄───┼───prev   │◄───┼───prev   │◄───┼───prev   │◄───┼───prev   β”‚
    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                          β–²
                          β”‚
                     moved node
    

    The work to remove the node and insert it again takes constant time by itself, but the work to find the node's new location does not.

    In the worst case we get a (long) list where every node has the same key, and the given pointer refers to the last node. The decrease-key operation then has to traverse the list from that last node all the way back to the first. Since we only get a single node reference, all we can do is follow the prev links until we arrive at the spot where the node should end up. This is O(N) in the worst case.


    Here I assume that a decrease-key operation decreases the key by 1. It can also be extended to take a parameter specifying how much should be subtracted from the key. The reasoning, however, remains the same: the algorithm has to search backwards through the list, starting from the given node reference, in order to find the right insertion spot for the node based on its updated key value. See the sketch below.
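
    To make this concrete, here is a minimal Python sketch (my own illustration, not part of the original question): a sorted doubly linked list with `delete` and `decrease_key` operations, where `decrease_key` walks backwards along the `prev` pointers to find the node's new position. The class and method names are just for this example.

    ```python
    class Node:
        def __init__(self, key):
            self.key = key
            self.prev = None
            self.next = None


    class SortedDoublyLinkedList:
        def __init__(self):
            self.head = None
            self.tail = None

        def insert(self, key):
            """Insert a new node, keeping the list sorted; return the node."""
            node = Node(key)
            cur = self.head
            while cur is not None and cur.key <= key:
                cur = cur.next
            if cur is None:                      # append at the tail
                node.prev = self.tail
                if self.tail:
                    self.tail.next = node
                else:
                    self.head = node
                self.tail = node
            else:                                # splice in just before `cur`
                node.prev, node.next = cur.prev, cur
                if cur.prev:
                    cur.prev.next = node
                else:
                    self.head = node
                cur.prev = node
            return node

        def delete(self, node):
            """O(1): unlink the node the given pointer refers to."""
            if node.prev:
                node.prev.next = node.next
            else:
                self.head = node.next
            if node.next:
                node.next.prev = node.prev
            else:
                self.tail = node.prev
            node.prev = node.next = None

        def decrease_key(self, node, amount=1):
            """Decrease the node's key, then move it left to keep the list sorted.

            Unlinking and relinking are O(1); walking the prev pointers to
            find the new position is O(N) in the worst case.
            """
            node.key -= amount
            # Walk backwards to find the new predecessor of `node`.
            target = node.prev
            while target is not None and target.key > node.key:
                target = target.prev
            if target is node.prev:              # already in the right place
                return
            self.delete(node)
            # Re-link `node` right after `target` (or at the head if target is None).
            node.prev = target
            node.next = target.next if target else self.head
            node.next.prev = node                # node.next always exists here
            if target:
                target.next = node
            else:
                self.head = node


    if __name__ == "__main__":
        lst = SortedDoublyLinkedList()
        nodes = [lst.insert(k) for k in (1, 2, 2, 2, 3)]
        lst.decrease_key(nodes[3])               # the node the "given pointer" refers to
        cur, keys = lst.head, []
        while cur:
            keys.append(cur.key)
            cur = cur.next
        print(keys)                              # [1, 1, 2, 2, 3]
    ```

    The example at the bottom reproduces the diagrams above: the fourth node (key 2) is decreased by 1 and ends up as the second node. The unlink and relink steps are constant time; the backwards walk along the prev pointers is what makes the operation O(N) in the worst case.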