complexity-theory, iterated-logarithm

Complexity of the base-2 iterated logarithm


Assuming the iterated logarithm is defined as it is here: http://en.wikipedia.org/wiki/Iterated_logarithm

How should I go about comparing its complexity to other functions, for example lg(lg(n))? So far I've done all the comparisons by calculating limits, but how do you calculate a limit involving the iterated logarithm?

I understand it grows very slowly, slower than lg(n), but is there some function that grows at roughly the same rate as lg*(n) (where lg* is the base-2 iterated logarithm) that would make it easier to compare with other functions? That way I could also compare, for example, lg*(lg(n)) to lg(lg*(n)). Any tips on comparing functions by their rate of growth would be appreciated.
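
For concreteness, here's a quick way to see the two functions side by side numerically (a Python sketch; log_star is just a helper written for this question, not a standard library function):

```python
import math

def log_star(n):
    """Base-2 iterated logarithm: how many times lg must be applied
    until the value drops to 1 or below."""
    count = 0
    while n > 1:
        n = math.log2(n)
        count += 1
    return count

# Compare lg*(n) with lg(lg(n)) at widely spaced points n = 2^k.
for k in (2, 4, 16, 256, 65536):
    n = 2 ** k  # lg(n) = k, so lg(lg(n)) = lg(k)
    print(f"n = 2^{k}: lg*(n) = {log_star(n)}, lg(lg(n)) = {math.log2(k):g}")
```

Even at n = 2^65536, lg*(n) is only 5 while lg(lg(n)) is 16, but I'd still like a way to compare such functions analytically (via limits) rather than from a table like this.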


Solution

  • The iterated logarithm function log* n isn't easily compared to another, more familiar function with similar behavior, in much the same way that log n isn't easily compared to another function with similar behavior.

    To understand the log* function, I've found it helpful to keep a few things in mind:

    1. This function grows extremely slowly. Figure that log* (2^2^2^2^2) = 5, and that tower of five 2s is a quantity that's bigger than the number of atoms in the known universe.
    2. It plays with logarithms the way that logarithms play with division. The quotient rule for logarithms says that log (n / 2) = log n - log 2 = log n - 1, assuming our logs are base 2. One intuition for this is that log n answers, in a sense, "how many times do you have to divide n by two to drop n down to 1?", so log (n / 2) "should" be log n - 1 because we did one extra division beforehand. On the other hand, (log n) / 2 is probably a lot smaller than log n - 1. Similarly, log* n counts "how many times do you have to take the log of n before you drop n down to some constant?", so log* log n = log* n - 1 because we've taken one additional log. On the other hand, log log* n is probably much smaller than log* n - 1. (The sketch below checks both of these relationships numerically.)
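
    Here's a minimal numeric sketch of both relationships (Python; log_star is just a throwaway helper written for this answer, not a library function):

    ```python
    import math

    def log_star(n):
        """Base-2 iterated logarithm: count how many times lg must be
        applied until the value drops to 1 or below."""
        count = 0
        while n > 1:
            n = math.log2(n)
            count += 1
        return count

    # n = 2^65536 is a tower of five 2s, so lg*(n) = 5.
    n = 2 ** 65536
    print("lg*(n)     =", log_star(n))              # 5
    print("lg*(lg n)  =", log_star(math.log2(n)))   # 4  -- exactly lg*(n) - 1
    print("lg(lg*(n)) =", math.log2(log_star(n)))   # ~2.32 -- smaller than lg*(n) - 1
    ```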

    Hope this helps!