This section contains 817 words (approx. 3 pages at 300 words per page)
The complexity of an algorithm is often described using what is called "big-O notation." Big-O notation is a theoretical measure of how the time or computer memory an algorithm requires grows with the size of the problem. The size of the problem is typically measured by the number of items to be processed.
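As a minimal sketch of this idea (a hypothetical example, not drawn from the text), consider a linear search that counts its basic operations. Because each item may be examined once, the worst-case step count grows in direct proportion to the number of items, which is what big-O notation captures as O(n):

```python
def linear_search_steps(items, target):
    """Return (found, number_of_comparisons) for a simple linear search."""
    steps = 0
    for item in items:
        steps += 1  # count one comparison per item examined
        if item == target:
            return True, steps
    return False, steps

for n in (10, 100, 1000):
    # Worst case: the target is absent, so every item is compared once.
    _, steps = linear_search_steps(list(range(n)), -1)
    print(n, steps)  # steps equals n in the worst case
```

Doubling the input doubles the worst-case work, so the algorithm's running time is said to be "linear," or O(n), regardless of how fast the computer running it happens to be.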
This measure is extremely important when deciding whether an algorithm can solve a problem of a given size at all, no matter how much time and money are spent building computers powerful enough to run it. Algorithms that are theoretically possible often require resources so vast that implementing them would bankrupt a nation.
Algorithms are often described as having "constant," "logarithmic," "linear," "n log n," "polynomial," "exponential," or "factorial" complexity, and these terms denote how the average running time of the algorithm grows as the size of the problem increases.
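The difference between these classes can be made concrete by evaluating each growth function at a few problem sizes. The sketch below (an illustrative comparison, with a quadratic standing in for the polynomial class) shows how dramatically the classes diverge even at modest n:

```python
import math

# Representative growth functions for each complexity class.
# "polynomial" is illustrated here by the quadratic n**2.
growth = {
    "constant":    lambda n: 1,
    "logarithmic": lambda n: math.log2(n),
    "linear":      lambda n: n,
    "n log n":     lambda n: n * math.log2(n),
    "polynomial":  lambda n: n ** 2,
    "exponential": lambda n: 2 ** n,
    "factorial":   lambda n: math.factorial(n),
}

for n in (10, 20):
    print(f"n = {n}")
    for name, f in growth.items():
        print(f"  {name:12s} {f(n):,.0f}")
```

At n = 20 the factorial function already exceeds two quintillion steps, while the n log n function stays under a hundred, which is why an algorithm's complexity class, far more than raw hardware speed, determines what problem sizes are practical.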