Time complexity
In computer science, the time complexity of an algorithm quantifies the amount of time it takes to run as a function of the length of the string representing the input. The time complexity of an algorithm is commonly expressed using big O notation, which excludes coefficients and lower-order terms. When expressed this way, the time complexity is said to be described asymptotically, i.e., as the input size goes to infinity. For example, if the time required by an algorithm on all inputs of size n is at most 5n³ + 3n, the asymptotic time complexity is O(n³).
The above text is a snippet from Wikipedia: Time complexity
and as such is available under the Creative Commons Attribution/Share-Alike License.
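As an illustrative sketch (not part of the snippets above), the following Python fragment assumes a hypothetical algorithm whose exact operation count is the cubic polynomial 5n³ + 3n from the example. Dividing by n³ shows the count settling toward the constant 5, which is why big O drops the coefficient and the lower-order term and reports simply O(n³):

```python
def exact_cost(n):
    # Hypothetical operation count of the form 5n^3 + 3n; big O
    # notation discards both the coefficient 5 and the 3n term.
    return 5 * n ** 3 + 3 * n

# As n grows, exact_cost(n) / n^3 approaches the constant 5,
# so the running time is asymptotically proportional to n^3.
for n in (10, 100, 1000):
    print(n, exact_cost(n) / n ** 3)
```

The ratio printed in the loop approaches 5 from above, making the "coefficients and lower-order terms don't matter" claim concrete.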
time complexity
Noun
- the amount of time an algorithm requires to run, as a function of the amount of input, measured in such a way as to ignore additive constant terms and constant factors
- Classical computers cannot sort a list of size <math>n</math> in less than <math>O(n\,\log\, n)</math> time.
The above text is a snippet from Wiktionary: time complexity
and as such is available under the Creative Commons Attribution/Share-Alike License.
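To make the usage example above concrete, here is a minimal merge sort sketch in Python. Merge sort is one of the classical comparison sorts that achieves the O(n log n) bound: the list is halved O(log n) times, and each level of merging does O(n) work.

```python
def merge_sort(items):
    """Sort a list in O(n log n) time by recursively halving and merging."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge the two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```

The lower bound quoted from Wiktionary applies to comparison-based sorting, so merge sort is asymptotically optimal in that model.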