An algorithm is a procedure consisting of prescribed steps that lead to a specific outcome, often the computation of a value. At the elementary level, the term usually refers to procedures for the basic operations of arithmetic: addition, subtraction, multiplication, and division. At a more advanced level, it refers to procedures designed to solve specific problems, many of which arise in computer science.
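To make the definition concrete, here is a minimal sketch of one such elementary algorithm: the grade-school procedure for adding two whole numbers column by column, carrying when a column's sum reaches ten. (The Python function and its name are illustrative, not from the original text.)

```python
def add_digits(a: str, b: str) -> str:
    """Grade-school addition: add two non-negative integers digit by
    digit, right to left, carrying when a column sum reaches 10."""
    result = []
    carry = 0
    i, j = len(a) - 1, len(b) - 1
    while i >= 0 or j >= 0 or carry:
        column = carry
        if i >= 0:
            column += int(a[i])
            i -= 1
        if j >= 0:
            column += int(b[j])
            j -= 1
        result.append(str(column % 10))  # digit written in this column
        carry = column // 10             # carry into the next column
    return "".join(reversed(result))

print(add_digits("478", "64"))  # prints 542
```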
The modern theory of algorithms developed in the twentieth century. It was greatly advanced in 1936, when the Princeton University mathematician Alonzo Church published the first precise definition of a calculable function. Church was joined at Princeton that year by his graduate student Alan Turing (1912-1954), a brilliant young British mathematician. Turing established the characteristics of algorithms by applying them to an abstract theoretical calculating device that became known as the Turing machine. He described his theory in the paper "On Computable Numbers," read to the London Mathematical Society in 1936 and published in its Proceedings in 1937. Turing's idea was that a machine could carry out any solvable mathematical computation by following a clear series of instructions, that is, an algorithm; this provided the theoretical basis for the computer.
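The Turing machine itself is simple enough to simulate in a few lines. The sketch below (in Python; the rule table and state names are our own illustration, not Turing's notation) runs a one-tape machine that increments a binary number, showing how a fixed table of instructions drives the whole computation:

```python
def run_turing_machine(tape, rules, state="right", halt="done", blank="_"):
    """Simulate a one-tape Turing machine: repeatedly look up
    (state, symbol) in the rule table, write a symbol, move the
    head one cell, and switch state until the halt state is reached."""
    tape = list(tape)
    head = 0
    while state != halt:
        if head < 0:                 # grow the tape on demand
            tape.insert(0, blank)
            head = 0
        elif head >= len(tape):
            tape.append(blank)
        write, move, state = rules[(state, tape[head])]
        tape[head] = write
        head += move
    return "".join(tape).strip(blank)

# Rule table for binary increment: scan to the right end, then add 1,
# propagating carries leftward.  (state, read) -> (write, move, next state)
INCREMENT = {
    ("right", "0"): ("0", +1, "right"),
    ("right", "1"): ("1", +1, "right"),
    ("right", "_"): ("_", -1, "carry"),
    ("carry", "1"): ("0", -1, "carry"),
    ("carry", "0"): ("1", 0, "done"),
    ("carry", "_"): ("1", 0, "done"),
}

print(run_turing_machine("1011", INCREMENT))  # prints 1100 (11 + 1 = 12)
```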
The field of algorithmic analysis, or computational complexity, plays an important role in computer science. Given an algorithm, the problem is to study its performance characteristics. To determine an algorithm's complexity, computer scientists count the number of steps it takes as a function of the problem size, for either worst-case or average-case input. By analyzing an algorithm, a programmer can see where its fast and slow parts are, and where improvements are needed to speed up its implementation on a computer.
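As an illustration of this counting approach, the sketch below (a hypothetical instrumentation, not a standard library facility) counts the comparisons two search algorithms make on worst-case input as the problem size n grows: linear search takes roughly n steps, while binary search on sorted data takes roughly log2(n).

```python
def linear_search_steps(items, target):
    """Count comparisons made by linear search; the worst case
    (target absent) examines every element, so steps grow as n."""
    steps = 0
    for item in items:
        steps += 1
        if item == target:
            break
    return steps

def binary_search_steps(items, target):
    """Count comparisons made by binary search on a sorted list;
    each step halves the search range, so steps grow as log2(n)."""
    steps, lo, hi = 0, 0, len(items) - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            break
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

# Worst case (target absent): step counts as a function of problem size n.
for n in (10, 100, 1000, 10000):
    data = list(range(n))
    print(n, linear_search_steps(data, -1), binary_search_steps(data, -1))
```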