Information Theory - Research Article from World of Mathematics

While researching how to transmit information more efficiently over noisy communications channels, the electrical engineer Claude Shannon published "A Mathematical Theory of Communication" in 1948, a paper that spawned two disciplines: information theory and coding theory. Shannon's paper captured the basic mathematical principles relevant to transmitting, receiving, and processing information over unspecified communications media, and it fundamentally redefined how communications engineers and specialists perceive information. At its essence, information theory combines elements of communications theory, probability, and statistics. Beyond providing a means to measure numerically the quantity of information to be transmitted, information theory also addresses how the information is to be represented (that is, coded) during transmission, as well as the capacity of the communications system to transmit, receive, process, and store the information.
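
As a brief illustration of such a numerical measure, the quantity Shannon attached to a source that emits symbols with probabilities p1, ..., pn is the entropy H = -(p1 log2 p1 + ... + pn log2 pn), expressed in bits per symbol. The short Python sketch below computes this value for a hypothetical four-symbol source; the probabilities are assumed values chosen only to illustrate the formula.

    import math

    def entropy(probabilities):
        """Return H = -sum(p * log2(p)), the entropy in bits per symbol."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # Hypothetical source distribution (illustrative values only).
    symbol_probs = [0.5, 0.25, 0.125, 0.125]
    print(f"{entropy(symbol_probs):.3f} bits per symbol")  # prints 1.750

A source whose symbols are all equally likely maximizes this quantity, which reflects the idea that the least predictable messages carry the most information.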

At its most basic level, a communications system consists of a message source (such as a telegraph, broadcast television...

Information Theory from Gale. ©2005-2006 Thomson Gale, a part of the Thomson Corporation. All rights reserved.