Information is the fundamental concept of information theory, the mathematical study of symbolic communication. Information theory was founded in 1948 with a single seminal paper, Claude Shannon's "A Mathematical Theory of Communication," and has since proved essential to the development of computers, telecommunications, digital music recordings, and many other technologies of the "Information Age." It has also been applied fruitfully in cryptography, genetics, linguistics, and other disciplines.
Shannon, a quirky mathematician and electrical engineer famous for juggling while riding his unicycle down the hallways of Bell Laboratories in the 1940s and 50s, was not the first theorist to ponder the subject of "information." He was, however, the first to define the term rigorously and to specify its unit of measure--the now-famous "bit" (short for "binary digit").
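To make the unit concrete: in Shannon's framework, a choice among N equally likely alternatives carries log2(N) bits of information, so a fair coin flip carries exactly one bit and a choice among 256 alternatives carries eight. The short sketch below simply works out that arithmetic; it is an illustrative aside rather than anything drawn from Shannon's paper itself, and the function name is our own.

import math

def bits_of_information(equally_likely_outcomes: int) -> float:
    # Shannon information, in bits, of one outcome drawn uniformly
    # from the given number of equally likely possibilities.
    return math.log2(equally_likely_outcomes)

print(bits_of_information(2))    # a fair coin flip: 1.0 bit
print(bits_of_information(256))  # one byte's worth of choices: 8.0 bits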
Shannon defined "information" in what might at first seem an odd way: not as a substance that can exist in fixed quantities, although this...