time-complexity, complexity-theory

Isn't every algorithm technically ω(0)?


Isn't every function ω(0)?

From my understanding, ω (little omega) notation is used as a strict lower bound for a function. Since no function can run in less than 0 "time", ω(0) must be a valid strict lower bound for every function.

From my understanding, this is like saying x^2 = O(x^3), which is technically right.

I know that this isn't useful, as ω(0) doesn't help us describe the answer at all, but isn't it technically right to say that every function is ω(0)?

I searched on the internet and didn't find anything, and when I asked ChatGPT it just told me I was wrong and didn't address my point directly.


Solution

  • Isn't every function ω(0)?

    Since we are measuring the execution time of algorithms, we can assume that the interesting functions f are the strictly positive ones.

    Under that assumption: yes, it is true that every such function satisfies f ∈ ω(0).

    The formal definition for ω notation is:

    f(n) ∈ ω(g(n)) if and only if: for all k > 0, there exists n0 such that for all n > n0: f(n) > k·g(n).

    Here our g(n) is 0.
    For every k > 0 we can choose, e.g., n0 = 0, and then we have:
    f(n) > k·g(n) for every n > 0 (because k·g(n) = k·0 = 0, and f is assumed to be positive).
    That is, our positive f satisfies the definition of being in ω(0).

    Having said all that, I don't know of any case where this analysis is actually useful.
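As a quick sanity check, the definition can be spot-checked numerically for g(n) = 0. This is only a sketch over sampled values of k and n (the function and parameter names below are illustrative, not from the original post), not a proof:

```python
# Spot-check the little-omega definition for g(n) = 0:
# f ∈ ω(g) requires that for every k > 0 there exists n0
# such that f(n) > k * g(n) for all n > n0.

def satisfies_omega_zero(f, ks, n0=0, n_max=1000):
    """Check f(n) > k * 0 for all sampled constants k and all n in (n0, n_max]."""
    return all(f(n) > k * 0
               for k in ks
               for n in range(n0 + 1, n_max + 1))

# Any strictly positive running-time function passes, no matter how large k is.
print(satisfies_omega_zero(lambda n: n ** 2, ks=[0.001, 1, 1e9]))  # True
print(satisfies_omega_zero(lambda n: 1, ks=[0.001, 1, 1e9]))       # True (even constant time)
```

Since k·0 is always 0, the check reduces to f(n) > 0, which is exactly the positivity assumption above.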