algorithm, time-complexity, big-o, complexity-theory

Are there any alternatives to Asymptotic Notation?


I found this definition:

Asymptotic Notations are languages to analyze an algorithm's running time by identifying its behavior as the input size for the algorithm increases. This is also known as an algorithm's growth rate.

This got me thinking: are there any other notations, or is there even a possibility of using some metric other than input size to analyze an algorithm?


Solution

Yes, there are many alternatives. For example, since asymptotic notation ignores leading coefficients, it's not a great tool for reasoning about exact operation counts. However, with a more precise analysis, you can sometimes pin down the leading coefficient of an algorithm's runtime as a function of input size. This has applications in areas like numerical methods, where inputs are huge and these leading constants matter.
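
As a concrete illustration, here is a small Python sketch (a toy example of my own, not from any particular reference) that measures the exact comparison count of insertion sort on random inputs and checks it against the well-known average-case leading term of roughly n^2/4:

    import random

    def insertion_sort_comparisons(a):
        # Sort a copy of `a`, returning the exact number of element comparisons.
        a = list(a)
        comparisons = 0
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            while j >= 0:
                comparisons += 1          # one comparison per test against key
                if a[j] > key:
                    a[j + 1] = a[j]
                    j -= 1
                else:
                    break
            a[j + 1] = key
        return comparisons

    random.seed(0)
    for n in (500, 1000, 2000):
        measured = insertion_sort_comparisons([random.random() for _ in range(n)])
        print(f"n={n}: measured={measured}, n^2/4={n * n / 4:.0f}, "
              f"ratio={measured / (n * n / 4):.3f}")

The printed ratio converges toward 1, which is exactly the kind of information big-O notation throws away.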

You might also look at the degree of parallelism inherent in an algorithm, which would be useful if you wanted to run the algorithm on a multicore machine. Or you might look at how much communication is required when the algorithm is parallelized, which has applications in distributed computing.
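
A standard way to quantify the first of these is work/span analysis: the work is the total number of operations the algorithm performs, the span is the length of its critical path, and work/span bounds the speedup any number of cores can deliver. A minimal sketch, assuming a balanced-tree parallel sum as the toy algorithm:

    import math

    def reduction_work_and_span(n):
        # Balanced-tree parallel sum of n values: work = total additions,
        # span = critical-path length (tree depth). work/span estimates the
        # parallelism a multicore machine could actually exploit.
        work = n - 1
        span = math.ceil(math.log2(n))
        return work, span, work / span

    for n in (8, 1024, 1_000_000):
        work, span, parallelism = reduction_work_and_span(n)
        print(f"n={n}: work={work}, span={span}, parallelism={parallelism:.0f}")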

You might also look at structural elements of how an algorithm is implemented. For code, you can look at things like the cyclomatic complexity, which measures how "complex" a piece of code is based on how many control paths exist through it. For Boolean circuits, you could look at the depth of the circuit. For sorting networks, you could look at the number of rounds in the network.
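
For the first of these, McCabe's cyclomatic complexity of a control-flow graph with E edges, N nodes, and P connected components is M = E - N + 2P. A tiny sketch, using a made-up control-flow graph of my own:

    def cyclomatic_complexity(edges, nodes, components=1):
        # McCabe's formula: M = E - N + 2P.
        return len(edges) - len(nodes) + 2 * components

    # Hypothetical control-flow graph: an if/else followed by a while loop.
    nodes = ["entry", "branch", "then", "else", "loop_test", "loop_body", "exit"]
    edges = [("entry", "branch"), ("branch", "then"), ("branch", "else"),
             ("then", "loop_test"), ("else", "loop_test"),
             ("loop_test", "loop_body"), ("loop_body", "loop_test"),
             ("loop_test", "exit")]
    print(cyclomatic_complexity(edges, nodes))  # E=8, N=7, P=1  ->  M = 3

M = 3 matches the rule of thumb "number of decision points plus one" (one if plus one loop test).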

You could also switch from looking at the size of the input to some other property of the input. The idea of fixed-parameter tractability is based on the insight that an input to an algorithm might have an "easy" part and a "hard" part, and if the "hard" part isn't that hard, you might be able to solve the problem quickly even if the input is large in a conventional sense.
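
The textbook example is vertex cover parameterized by the cover size k: a bounded search tree decides it in O(2^k * m) time, exponential only in the parameter k, not in the size of the graph. A minimal Python sketch:

    def vertex_cover_at_most_k(edges, k):
        # Does the graph have a vertex cover of size <= k?
        # Pick any uncovered edge (u, v); one endpoint must be in the cover,
        # so branch on both choices. Depth of the search tree is at most k.
        if not edges:
            return True           # every edge is covered
        if k == 0:
            return False          # edges remain but no budget left
        u, v = edges[0]
        rest_u = [e for e in edges if u not in e]   # take u into the cover
        rest_v = [e for e in edges if v not in e]   # take v into the cover
        return (vertex_cover_at_most_k(rest_u, k - 1)
                or vertex_cover_at_most_k(rest_v, k - 1))

    # A 5-cycle needs a cover of size 3.
    cycle5 = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
    print(vertex_cover_at_most_k(cycle5, 2))  # False
    print(vertex_cover_at_most_k(cycle5, 3))  # True

Even on a huge graph, this runs quickly as long as the parameter k stays small, which is exactly the fixed-parameter-tractability idea.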

You could also analyze the algorithm's sensitivity to tiny changes in its input. Perhaps the algorithm runs extremely quickly on most instances but is painfully slow on others (the simplex method for linear programming is a good example of this). Tools like smoothed complexity look at precisely this.
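
To make that concrete, here is a rough sketch of the smoothed-analysis recipe: take a fixed adversarial instance, perturb it with Gaussian noise of standard deviation sigma, and average the cost over perturbations. (The cost measure here is a toy of my choosing, comparison counts for quicksort with a naive first-element pivot, standing in for something like simplex pivot counts.)

    import random
    import statistics
    import sys

    def quicksort_comparisons(a):
        # Logical comparison count for quicksort with a first-element pivot
        # (each element is compared to the pivot once per partition step).
        # An already-sorted input is the quadratic worst case.
        if len(a) <= 1:
            return 0
        pivot = a[0]
        less = [x for x in a[1:] if x < pivot]
        rest = [x for x in a[1:] if x >= pivot]
        return len(a) - 1 + quicksort_comparisons(less) + quicksort_comparisons(rest)

    def smoothed_cost(instance, sigma, trials=10):
        # Average cost over Gaussian perturbations of one fixed instance;
        # smoothed complexity takes the worst case of this over all instances.
        return statistics.mean(
            quicksort_comparisons([x + random.gauss(0, sigma) for x in instance])
            for _ in range(trials)
        )

    sys.setrecursionlimit(5000)
    random.seed(0)
    worst = [float(i) for i in range(1000)]   # sorted: the adversarial instance
    print("unperturbed:", quicksort_comparisons(worst))
    for sigma in (0.5, 50.0, 5000.0):
        print(f"sigma={sigma}: {smoothed_cost(worst, sigma):.0f}")

With a tiny sigma the sorted structure survives and the cost stays near quadratic, but larger perturbations wash out the adversarial ordering and the average drops toward the typical n log n behavior; smoothed analysis makes this kind of observation precise.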