computer-science, asymptotic-complexity, little-o

If f(n) = o(g(n)), then is 2^(f(n)) = o(2^(g(n)))?


Notice that I am asking about little-o here (see the similar question here): for big-O it is clearly false, but for little-o it feels right, yet I can't seem to prove it...

EDIT: glad I raised a debate :) Assume f, g > 0 for simplicity.
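
To make the "clearly false for big-O" remark concrete, here is a minimal numeric sketch (the choice f(n) = 2n, g(n) = n is an illustrative example, not from the original question): f is O(g), yet the ratio 2^f(n) / 2^g(n) = 2^n is unbounded, so 2^f(n) is not O(2^g(n)).

    # Illustrative example (assumed, not from the question): f(n) = 2n, g(n) = n.
    # f is O(g), but 2^f(n) / 2^g(n) = 2^n grows without bound,
    # so 2^f(n) is not O(2^g(n)).
    def f(n):
        return 2 * n

    def g(n):
        return n

    for n in [1, 5, 10, 20, 40]:
        ratio = 2 ** f(n) / 2 ** g(n)  # equals 2^n
        print(f"n={n:>3}  2^f(n) / 2^g(n) = {ratio:.3e}")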


Solution

  • It is, at least if g(n) tends to positive infinity as n tends to positive infinity (if g(n) doesn't, counterexamples are easy to find).

    Sketch of a proof:

    Prerequisite: g(n) tends to positive infinity as n tends to positive infinity.

    f(n) in o(g(n)) means:

    for every eps > 0 there exists an n0 such that abs(f(n)) < abs(g(n)*eps) for all n > n0.
    

    From this it follows:

    2^abs(f(n)) < 2^abs(g(n)*eps) = (2^eps)^g(n) (for n > n0, using g(n) > 0).
    

    For eps < 1:

    (2^eps)^n is in o(2^n) (as 2^eps < 2) 
    

    it follows that for every eps2 > 0 there exists an n1 such that for all n > n1

    (2^eps)^n < eps2*(2^n).
    

    Choosing eps2 = eps yields:

    (2^eps)^n < eps*(2^n) for all n > n1 (n1 depends on eps)
    

    Because g(n) -> positive infinity as n -> positive infinity, there exists an n2 such that

    g(n) > n1 for all n > n2
    

    From this it follows that

    (2^eps)^g(n) < eps*2^g(n) for all n > n2.
    

    So for every 0 < eps < 1 there exists an n3 >= max(n0, n2) such that

    2^abs(f(n)) < 2^abs(g(n)*eps) = (2^eps)^g(n) < eps*2^g(n) for all n > n3.
    

    For every eps3 >= 1 we then also have (with eps < 1 as above):

    2^abs(f(n)) < eps*2^g(n) < eps3*2^g(n) (for all n > n3)
    

    So for every eps > 0 there exists an n3 such that

    2^abs(f(n)) < eps*2^g(n) for all n > n3
    

    Because 2^f(n) <= 2^abs(f(n)) for all n, and 2^x > 0 for all real x, it follows that

    abs(2^f(n)) < abs(eps*2^g(n)) for all n > n3
    

    q.e.d.

    If something is unclear or wrong, please comment. (A small numeric sanity check of the claim is sketched at the end of this answer.)

    EDIT: Some thoughts on other g(n):

    A subsequence of g(n) is bounded, i.e. there exists some x such that there is no n0 with abs(g(n)) >= x for all n > n0:

    Along such a subsequence, every f(n) in o(g(n)) must converge to 0 (or be constant 0 from some point on), so 2^f(n) converges to 1 there, while 2^g(n) stays bounded on the same subsequence. The ratio 2^f(n)/2^g(n) therefore does not tend to 0, so 2^f(n) is not in o(2^g(n)).

    There is no n0 such that g(n) > 0 for all n > n0:

    2^g(n) <= 1 whenever g(n) <= 0, so 2^g(n) has a bounded subsequence, meaning any function in o(2^g(n)) must converge to 0 (or be constant 0 from some point on) along that subsequence. With the assumption f > 0 we have 2^f(n) >= 1, so again 2^f(n) is not in o(2^g(n)). (A concrete numeric counterexample with a bounded g is sketched below.)
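
    As a quick numeric sanity check of the claim (not part of the proof; f(n) = sqrt(n) and g(n) = n are illustrative choices, not taken from the argument above), the ratio 2^f(n)/2^g(n) should tend to 0, since sqrt(n) is in o(n) and n -> positive infinity:

        import math

        # Illustrative choices (assumed): f(n) = sqrt(n) is o(n), g(n) = n -> +inf,
        # so by the claim above 2^f(n) should be o(2^g(n)),
        # i.e. the ratio 2^f(n)/2^g(n) should go to 0.
        def f(n):
            return math.sqrt(n)

        def g(n):
            return n

        for n in [1, 4, 16, 64, 256]:
            ratio = 2.0 ** f(n) / 2.0 ** g(n)
            print(f"n={n:>4}  2^f(n)/2^g(n) = {ratio:.3e}")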
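
    And one concrete instance of the counterexamples mentioned above when g(n) does not tend to infinity (the specific choice g(n) = 1, f(n) = 1/n is just one possible illustration): f is in o(g) because f(n)/g(n) = 1/n -> 0, but 2^f(n)/2^g(n) -> 1/2, so 2^f(n) is not in o(2^g(n)).

        # Illustrative counterexample (assumed) for bounded g: g(n) = 1 (constant), f(n) = 1/n.
        # f is o(g) since f(n)/g(n) = 1/n -> 0, yet the ratio 2^f(n)/2^g(n)
        # tends to 1/2 rather than 0, so 2^f(n) is NOT o(2^g(n)).
        def f(n):
            return 1.0 / n

        def g(n):
            return 1.0

        for n in [1, 10, 100, 1000, 10**6]:
            ratio = 2.0 ** f(n) / 2.0 ** g(n)
            print(f"n={n:>7}  2^f(n)/2^g(n) = {ratio:.6f}")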