While researching how to transmit information more efficiently over noisy communications channels, Claude Shannon, an electrical engineer, published "A Mathematical Theory of Communication" in 1948, a paper that spawned two disciplines: information theory and coding theory. Shannon's paper captured the basic mathematical principles governing the transmission, reception, and processing of information over unspecified communications media, and it fundamentally redefined how communications engineers and specialists perceive information. At its essence, information theory combines elements of communications theory, probability, and statistics. Beyond providing a means to measure numerically the quantity of information to be transmitted, information theory also encompasses how the information is represented (that is, coded) during transmission, as well as the capacity of the communications system to transmit, receive, process, and store that information.
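The quantity Shannon used to measure information is entropy, H = -Σ p_i log2(p_i), the average number of bits per symbol produced by a message source. The sketch below, a minimal illustration rather than anything from Shannon's paper (the function name shannon_entropy is ours), estimates entropy from symbol frequencies in a sample message:

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Estimate entropy in bits per symbol from observed symbol frequencies."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# A highly predictable source carries less information per symbol
# than one whose symbols are closer to equally likely.
print(shannon_entropy("aaaaaaab"))   # ~0.54 bits/symbol
print(shannon_entropy("abcdefgh"))   # 3.0 bits/symbol
```

The second message needs three bits per symbol because all eight symbols are equally likely, while the first, being nearly all one symbol, conveys far less information per symbol; this is the sense in which information can be counted.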
At its most basic level, a communications system consists of a message source (such as a telegraph, broadcast television...