algorithm, time-complexity, binary-tree, traversal

Prove that k successive calls to TREE-SUCCESSOR take O(k + h) time


This question has been around for a while, but I still have a question about it. I will discuss the following solution.

Question: Prove that no matter what node we start at in a height-h binary search tree, k successive calls to TREE-SUCCESSOR take O(k + h) time.

The TREE-SUCCESSOR algorithm is given below; for a node x with a right child, it finds the left-most node of x's right subtree (otherwise it walks up to the lowest ancestor whose left child is also an ancestor of x):

(image: TREE-SUCCESSOR pseudocode, not reproduced)
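For reference, the CLRS-style TREE-SUCCESSOR can be sketched in Python as follows; the Node class and names are illustrative, assuming each node stores left, right, and parent pointers:

```python
class Node:
    def __init__(self, key, parent=None):
        self.key = key
        self.left = None
        self.right = None
        self.parent = parent

def tree_minimum(x):
    # Follow left-child links to the smallest key in x's subtree.
    while x.left is not None:
        x = x.left
    return x

def tree_successor(x):
    # Case 1: x has a right subtree -> successor is its leftmost node.
    if x.right is not None:
        return tree_minimum(x.right)
    # Case 2: climb until we leave a right subtree; the first ancestor
    # reached via a left-child link is the successor (None if x is max).
    y = x.parent
    while y is not None and x is y.right:
        x = y
        y = y.parent
    return y
```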

Assumptions: Assume that the worst-case running time of TREE-SUCCESSOR() is O(h), where h is the height of the binary search tree, and that the tree is not modified between successive calls to the method. The worst case occurs when the next successor is a leaf at depth h.

You call the method once to get the next successor, and you need O(h) time for that. Once you have the next successor you can simply store it, and every other call returns it in O(1). Since you have k−1 remaining calls, their running time is O(k). Combining this with the above, the total running time is O(h)+O(k)=O(k+h).

Question: I am not sure about the last statement: how did we get a running time of O(k)? I assume k successive calls to TREE-SUCCESSOR() means that we call it k times, possibly on the same or on different nodes. If that is the case, we know the worst-case time for TREE-SUCCESSOR is O(h), so how do we get a total time of O(h+k)? I would expect that calling TREE-SUCCESSOR() k times gives a worst case of O(kh).

Edit: I would assume that k successive calls to TREE-SUCCESSOR() means TREE-SUCCESSOR() + ⋯ + TREE-SUCCESSOR(), k times?
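A sketch of what is usually meant by "k successive calls" (the successor argument is any successor routine, passed in only to keep the example self-contained):

```python
def k_successive_calls(start, k, successor):
    # "k successive calls" means each call starts from the node the
    # previous call returned -- a partial in-order walk from `start`,
    # not k independent worst-case queries on arbitrary nodes.
    x = start
    for _ in range(k):
        x = successor(x)
    return x
```

Because the calls chain together like this, the costly O(h) descents and ascents cannot all happen on every call, which is why O(kh) turns out to be too pessimistic.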


Solution

  • It is well known that a full in-order traversal of a binary tree takes O(n) time when implemented as repeated successor calls. Your problem is to prove the equivalent statement for partial traversals of a binary tree.

    A weaker version of the statement you have to prove is that the amortized cost of each successive call is O(1). I would prove that statement using the potential method. A similar method can be used for the proof you need:

    To show that k successive calls require O(k+h) time, we find a "bank function" B that operates on the current node, and show that:

    1. The time required for y = TREE-SUCCESSOR(x) is in O(1 + B(y) - B(x)); and
    2. 0 <= B(x) <= h for all x

    It follows from (1) that after k successive calls, with y = TREE-SUCCESSOR^k(x), the total time is in O(k + B(y) − B(x)) because the intermediate B terms telescope, and it follows from (2) that B(y) − B(x) is in O(h), so this proves that the time is in O(k + h).

    A bank function that works is B(x) = (h + dL(x) - dR(x))/2, where dL(x) is the number of left-child links on the path from the root to x, and dR(x) is the number of right-child links on that path.

    Consider the two cases in which TREE-SUCCESSOR(x) does non-constant work:

    1. When x has a right child, the extra work is proportional to the number of left-child links added in TREE-MINIMUM. That is the change in dL. The number of right-child links increases by only 1, so the extra work is proportional to the change in B.
    2. When x has no right child, the extra work is proportional to the number of right-child links we remove by traversing to the parent. That is the decrease in dR. The number of left-child links changes by only 1, so again the extra work is proportional to the change in B.

    Those are the only two cases, and in both the total work is in O(1 + B(y) - B(x)), so condition (1) holds. Condition (2) is immediate from the definition of B, since 0 <= dL(x) <= h and 0 <= dR(x) <= h, so the statement is proven.
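As a sanity check (not a proof), one can count pointer moves over k successive calls on a concrete tree and compare the total against 2(k + h). The code below is an illustrative restatement of TREE-SUCCESSOR with a step counter; all names are mine, and the constant 2 matches the per-call cost 2(1 + ΔB) derived from the two cases above:

```python
import random

class Node:
    def __init__(self, key):
        self.key = key
        self.left = self.right = self.parent = None

def bst_insert(root, key):
    # Standard unbalanced BST insertion; returns the (possibly new) root.
    parent, cur = None, root
    while cur is not None:
        parent = cur
        cur = cur.left if key < cur.key else cur.right
    node = Node(key)
    node.parent = parent
    if parent is None:
        return node
    if key < parent.key:
        parent.left = node
    else:
        parent.right = node
    return root

def successor(x, steps):
    # TREE-SUCCESSOR that counts every pointer traversal in steps[0].
    if x.right is not None:
        x = x.right
        steps[0] += 1
        while x.left is not None:          # TREE-MINIMUM: follow left links
            x = x.left
            steps[0] += 1
        return x
    y = x.parent
    steps[0] += 1
    while y is not None and x is y.right:  # climb out of right subtrees
        x, y = y, y.parent
        steps[0] += 1
    return y

def height(node):
    return -1 if node is None else 1 + max(height(node.left), height(node.right))

random.seed(0)
keys = list(range(100))
random.shuffle(keys)
root = None
for key in keys:
    root = bst_insert(root, key)

start = root
while start.left is not None:              # begin at the smallest key
    start = start.left

steps, node, k = [0], start, 50
for _ in range(k):
    node = successor(node, steps)

h = height(root)
# Total pointer moves over k successive calls never exceed 2(k + h).
print(node.key, steps[0], 2 * (k + h))
```

Starting from the minimum and calling k = 50 times lands on key 50, and the counted steps stay within the 2(k + h) budget, consistent with the bank-function bound.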