Time complexity

In computer science, the time complexity of an algorithm quantifies the amount of time it takes to run as a function of the length of the string representing the input.

The time complexity of an algorithm is commonly expressed using big O notation, which excludes constant coefficients and lower-order terms. When expressed this way, the time complexity is said to be described asymptotically, i.e., as the input size goes to infinity.
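As a hedged worked example (the polynomial is illustrative, not from this page): for

    T(n) = 4n^2 + 7n + 3

the coefficient 4 and the lower-order terms 7n + 3 are dropped, leaving T(n) = O(n^2), because the n^2 term dominates as n grows.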

Since an algorithm's running time may vary across different inputs of the same size, one commonly uses the worst-case time complexity of an algorithm, denoted T(n), which is defined as the maximum amount of time taken on any input of size n.
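A minimal Python sketch (the function and its comparison counter are illustrative assumptions, not part of this page) of why worst-case analysis takes the maximum over all inputs of size n: linear search can find a target in a single comparison, but an absent target forces all n comparisons, so its worst-case T(n) = n.

    def linear_search(items, target):
        """Return (index, comparisons); index is -1 if target is absent."""
        comparisons = 0
        for i, value in enumerate(items):
            comparisons += 1           # one comparison per element examined
            if value == target:
                return i, comparisons  # best case: found immediately
        return -1, comparisons         # worst case: all n elements compared

    # Worst case for n = 5: the target is absent, so every element is checked.
    _, worst = linear_search([3, 1, 4, 1, 5], 9)
    print(worst)  # 5, i.e. T(n) = n for this input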

Less common, and usually specified explicitly, is the measure of average-case complexity; for example, quicksort runs in O(n log n) time on average but O(n^2) time in the worst case.
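A rough empirical sketch of the distinction, reusing the linear_search sketch above (the trial count is an arbitrary assumption): when the target sits at a uniformly random position, linear search averages about (n + 1) / 2 comparisons, roughly half its worst case of n.

    import random

    def average_comparisons(n, trials=10_000):
        """Estimate the mean comparison count for a random target position."""
        total = 0
        for _ in range(trials):
            items = list(range(n))
            target = random.randrange(n)             # uniformly random position
            _, comps = linear_search(items, target)
            total += comps
        return total / trials

    print(average_comparisons(100))  # close to 50.5, versus a worst case of 100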

Time complexities are classified by the nature of the function T(n): for example, an algorithm with T(n) = O(n) is called a linear time algorithm, and one with T(n) = O(2^n) is an exponential time algorithm.
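As a hedged illustration of two such classes (both functions are assumed examples, not from this page): checking a list for duplicates with nested loops takes quadratic time, while a single hash-set pass takes expected linear time.

    def has_duplicate_quadratic(items):
        # Compares every pair: about n * (n - 1) / 2 checks, so T(n) = O(n^2).
        for i in range(len(items)):
            for j in range(i + 1, len(items)):
                if items[i] == items[j]:
                    return True
        return False

    def has_duplicate_linear(items):
        # Single pass with a hash set: O(n) expected time.
        seen = set()
        for value in items:
            if value in seen:
                return True
            seen.add(value)
        return False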

External links

  • Time complexity