algorithm, time-complexity, big-o

What's the difference between O(n) and O(n + n^(1/2)) in algorithms?


I watched an online course which mentioned that O(n) and O(n + n^(1/2)) are equivalent in algorithms.

I'm curious about why the n^(1/2) part can be ignored. What is the reason for this omission? I also want to know in what other situations a part of a polynomial can be ignored. Thanks a lot!


Solution

  • Roughly speaking, big-O is an asymptotic upper bound on a function's growth.

    For n >= 1 we have n^(1/2) <= n^1 = n, so n + n^(1/2) <= n + n = 2*n. That means anything in O(n + n^(1/2)) is also in O(2*n).

    And since constants are not relevant for big-O analysis, this is the same as O(n).

    If you are not sure why constants are not relevant, you can review the definition of big-O.
    In a nutshell:
    f(n) is in O(g(n)) if and only if there exist a constant c > 0 and a threshold n0 such that for all n >= n0, it holds that c*g(n) >= f(n).
    Since the definition allows g to be multiplied by any constant c to bound f, we can always choose a c large enough that the constant factors in f and g do not matter. (A quick numerical check of this appears at the end of this answer.)

    A side note regarding terminology:
    A polynomial can only contain non-negative integer powers of the variable, so an expression containing n^(1/2) is not really a polynomial.
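
    To make this concrete, here is a minimal Python sketch of the argument above. The names f, g and the chosen witnesses c = 2, n0 = 1 are just illustrations, not anything from the course:

        import math

        def f(n):
            # The function from the question: n + n^(1/2).
            return n + math.sqrt(n)

        def g(n):
            # The proposed bound: n.
            return n

        # Witnesses for the big-O definition: c = 2 and n0 = 1 work,
        # because sqrt(n) <= n once n >= 1, hence n + sqrt(n) <= 2*n.
        c, n0 = 2, 1

        for n in [1, 10, 100, 10_000, 1_000_000]:
            assert c * g(n) >= f(n)  # the definition's inequality holds
            # The ratio f(n)/g(n) approaches 1, showing the sqrt term fades.
            print(f"n = {n:>9,}: f(n)/g(n) = {f(n) / g(n):.4f}")

    For n = 1 the ratio is 2.0; by n = 1,000,000 it is down to 1.001. That shrinking ratio is the sense in which the n^(1/2) part can be ignored.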