Information and Information Theory - Research Article from World of Computer Science

This encyclopedia article consists of approximately 5 pages of information about Information and Information Theory.


In 1948 Claude Shannon published a seminal paper, "A Mathematical Theory of Communication," in the Bell System Technical Journal, inaugurating the body of concepts now known as information theory. In his paper, Shannon defined a precise quantitative measure for the amount of "information" contained in any message and proved important mathematical theorems about the fundamental limits at which information can be communicated. To state and prove his theorems, Shannon used a probabilistic model of a communication system: a source, say X, generates symbols from a predefined alphabet; the symbols are transmitted over an imperfect communication channel; and a decoder, Y, attempts to reproduce the original message (the string of alphabet symbols).
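Shannon's source-channel-decoder pipeline can be made concrete with a toy simulation. The sketch below is illustrative only (the function names and parameters are not from the article): a source emits binary symbols, a binary symmetric channel flips each bit with some probability, and a simple majority-vote repetition decoder plays the role of Y.

```python
import random

def source(alphabet, probs, n, rng):
    """Shannon's source X: emit n symbols from a predefined alphabet."""
    return rng.choices(alphabet, weights=probs, k=n)

def noisy_channel(bits, flip_prob, rng):
    """An imperfect channel: each bit is flipped with probability flip_prob."""
    return [b ^ 1 if rng.random() < flip_prob else b for b in bits]

def repetition_encode(bits, r=3):
    """Send each bit r times to protect against channel noise."""
    return [b for b in bits for _ in range(r)]

def repetition_decode(bits, r=3):
    """Shannon's decoder Y: majority vote over each group of r repeats."""
    return [1 if 2 * sum(bits[i:i + r]) > r else 0
            for i in range(0, len(bits), r)]

rng = random.Random(0)
msg = source([0, 1], [0.5, 0.5], 1000, rng)
received = repetition_decode(noisy_channel(repetition_encode(msg), 0.05, rng))
errors = sum(a != b for a, b in zip(msg, received))
```

With a 5% flip probability, the majority vote drives the residual error rate down to roughly 3p²(1-p) + p³, i.e. under one percent, illustrating how coding trades rate for reliability.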

In this model of communication, an important characteristic of the source is its entropy, H(X). If the source X emits N different symbols x_i...
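The excerpt breaks off here. For reference, the standard definition of the entropy of a discrete source (a well-known formula, not quoted from the article) is H(X) = -Σ p_i log₂ p_i bits per symbol, where p_i is the probability of symbol x_i. A minimal sketch:

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum(p_i * log2(p_i)), in bits per symbol.

    Terms with zero probability are skipped, since p*log2(p) -> 0 as p -> 0.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin source carries exactly 1 bit per symbol;
# a biased source carries less, and a certain outcome carries none.
fair = entropy([0.5, 0.5])    # -> 1.0
biased = entropy([0.9, 0.1])  # -> about 0.469
```

Entropy is maximized when all N symbols are equally likely, giving H(X) = log₂ N.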


Information and Information Theory from Gale. ©2005-2006 Thomson Gale, a part of the Thomson Corporation. All rights reserved.