recursion, time-complexity, recurrence

Constant in a recurrence relation


Consider the code below, which computes n to the power p for p >= 0; the time for each step is annotated.

int pow(int n, int p){  // T(p)
    if (n == 1 || p == 0){  // O(1)
        return 1;
    }
    return n * pow(n, p - 1); // n * T(p - 1)
}

Now, I wrote the recurrence relation for this function as T(p) = n*T(p - 1) + O(1).

I know this is incorrect, because the solution I found for this relation came out to be O(n^p). I suspect that the constant n should not appear as a multiplier in the relation, but I am confused about why not, given that the multiplication by n does happen in the code. Can you please explain this?


Solution

n*T(p - 1) would be correct if the algorithm evaluated the statement return n * pow(n, p - 1) as follows:

    temp = 0
    for i from 1 to n:
        temp = temp + pow(n, p - 1)
    return temp
    

But that is not what happens: the recursive call pow(n, p - 1) is made only once when evaluating n * pow(n, p - 1), so in terms of "work" it counts once, not n times.

The cost of multiplication

The above explains why you should not multiply by n to determine the overall time complexity, but I should mention that performing the multiplication of two numbers has a cost of its own, which may or may not be ignored.

There are two approaches for determining the time complexity of an arithmetic operation like multiplication:

1. It is not uncommon to consider multiplication as an elementary operation that is assumed to finish in constant time (see Wikipedia's Time complexity), so that the overall time complexity of your algorithm is O(p). This is a simplification, but it makes sense when using a programming language with an integer type that has a fixed memory size (like 64 bits).

2. A more thorough complexity analysis reasons that the multiplication of arbitrarily large (big) integers takes more time the larger the operands are (see Wikipedia's Computational complexity of mathematical operations). With that approach, a multiplication whose product needs k bits to be stored takes O(k) time (schoolbook multiplication by a word-sized n). Since multiplying n with the product coming from the recursion adds about log n bits to that product, the i-th multiplication costs O(i·log n), and we get the following complexity for the whole algorithm:

  O(log n + 2·log n + ... + p·log n) = O(p²·log n)