algorithm · time-complexity · computability

Why is it assumed that the time complexity of multiplication by n is constant?


Regardless of how the multiplication (or division) operation is implemented (i.e. whether it is a software function or a hardware instruction), it isn't solvable in O(1) time. For large values of n, the processor cannot even compute it with a single instruction.

In algorithms like the one below, why are these operations treated as constant rather than dependent on n?

for (i = 1; i <= n; i++) {
    j = n;
    while (j > 1)
        j = j / 3;    //constant operation
}

Solution

  • Time complexity is not a measure of time. It's a measure of "basic operations", which can be defined however you want. Often, any arithmetic operation is considered a basic operation. Sometimes (for example, when considering the time complexity of sorting algorithms, or hash table operations), the basic operations are comparisons. Sometimes, "basic operations" are operations on single bits (in which case j = j / 3 would have time complexity O(log j)).
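    For example, here is a rough sketch (not from the original answer; the variable names and counts are purely illustrative) of how the loop in the question can be charged under two different cost models: one unit per division, versus roughly one unit per bit of j for each division:

    #include <stdio.h>

    int main(void) {
        long n = 100000;        /* hypothetical input size */
        long divisions = 0;     /* divisions counted as unit-cost operations */
        long bit_ops = 0;       /* same divisions, charged ~one unit per bit of j */

        for (long i = 1; i <= n; i++) {
            long j = n;
            while (j > 1) {
                divisions++;                     /* "arithmetic op = basic op" model */
                for (long t = j; t > 0; t >>= 1)
                    bit_ops++;                   /* bit-level model: ~log2(j) per division */
                j = j / 3;
            }
        }

        printf("unit-cost divisions:    %ld\n", divisions);
        printf("approx. bit operations: %ld\n", bit_ops);
        return 0;
    }

    Under the first model the inner while loop costs O(log n); under the bit-level model the same loop costs roughly O(log n · log n), which is exactly why the choice of basic operation matters.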

    The rules that tend to be followed are: