algorithm, recursion, math, computer-science, clrs

How can a never-ending recursive function have a time complexity?


In the book "Introduction to Algorithms" by CLRS, we are asked to find the time complexity of a recursive function:

4.4-8
Use a recursion tree to give an asymptotically tight solution to the recurrence T(n) = T(n - a) + T(a) + cn where a ≥ 1 and c > 0 are constants.

However, when T(a) is called, T(a) will be called again, which will call T(a) again, and so on. There will never be a base case for this branch, so the function will never end! How can this function then have a time complexity of O(n^2) when it actually results in O(∞)?

         n 
        / \
       /   \
   n - a     a
   / \      / \
  /   \    /   \
n-2a   a  0     a  <-- Never ending
/ \   / \      / \
     0   a    0   a
                   \  <-- No base case
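
Concretely, the right-hand branch never ends because substituting n = a into the recurrence gives

    T(a) = T(a - a) + T(a) + ca = T(0) + T(a) + ca

so T(a) reappears on its own right-hand side, and without a base case the expansion never bottoms out.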

Proofs for O(n^2): Link, Link, Link.
Is this a case where the mathematical proof doesn't match reality, or have I misinterpreted what the function actually means? To clarify, I am not asking how the mathematical proof works; I just don't get how it could be the right answer given the logic I have described. Moreover, what does O(n^2) mean for this function, when every n results in a never-ending recursion as long as a > 0, which according to the question is always the case?


Solution

  • When recurrence relations are given for the purpose of determining complexity, the base case(s) are often just left out.

Assuming that the program terminates, if x is any constant then T(x) is also a constant. You can simply replace it with a constant like "d", or remember that you don't have to expand it.

Since it doesn't matter what the constant is, it will often be written as O(1), even though that is formally incorrect, because O(1) is a set. It is also not appropriate here, since you are looking for an asymptotically tight bound; a worked derivation and a small runnable sketch follow below.
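
    Here is how the recursion tree sums once T(a) is treated as a constant d (a sketch with arbitrary constants, not the book's derivation). Level i of the tree strips a from the argument at cost c(n - ia) plus the constant leaf d, and there are roughly n/a levels before the base case is reached:

        T(n) = sum_{i=0}^{n/a - 1} (c(n - ia) + d) + Theta(1)
             = c(n^2/(2a) + n/2) + dn/a + Theta(1)
             = Theta(n^2)

    since a, c, and d are constants.

    And a minimal sketch in Python, assuming a hypothetical base case T(n) = d for n <= a; the constant values a = 3, c = 2, d = 1 are made up for illustration:

        def T(n, a=3, c=2, d=1):
            # Assumed base case: T(n) = d (a constant) for n <= a.
            # This is the part the textbook statement leaves implicit.
            if n <= a:
                return d
            # One level of the recursion tree: T(n - a) + T(a) + c*n.
            return T(n - a, a, c, d) + T(a, a, c, d) + c * n

        # Doubling n roughly quadruples the total, consistent with Theta(n^2):
        print(T(300), T(600), T(600) / T(300))

    With the base case in place, the call T(a) returns immediately, so the "never-ending" branch from the question disappears.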