In 1948 Claude Shannon published a seminal paper, "A Mathematical Theory of Communication," in the Bell System Technical Journal, inaugurating the body of concepts now known as information theory. In this paper, Shannon defined a precise quantitative measure of the amount of "information" contained in any message and proved important mathematical theorems about the fundamental limits at which information can be communicated. To state and prove his theorems, Shannon used a probabilistic model of a communication system consisting of a source, say X, that generates symbols drawn from some predefined alphabet; these symbols are transmitted over an imperfect communication channel to a decoder, Y, which then, ideally, reproduces the original message (a string of alphabetic symbols).
In this model of communication, an important characteristic of the source is its entropy, H(X). If the source X emits N different symbols x1, ..., xN with probabilities p1, ..., pN, its entropy is H(X) = -Σi pi log2 pi, measured in bits per symbol; it quantifies the average amount of information conveyed by each symbol the source emits.
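To make the definition concrete, the following minimal Python sketch computes H(X) either from a given probability distribution or from symbol frequencies estimated from a sample message. The function names (entropy_bits, empirical_entropy) and the sample inputs are purely illustrative, not anything defined by Shannon's paper.

    import math
    from collections import Counter

    def entropy_bits(probabilities):
        """Shannon entropy H = -sum(p * log2(p)), in bits per symbol.
        Terms with p = 0 contribute nothing and are skipped."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    def empirical_entropy(message):
        """Estimate the source entropy by counting symbol frequencies
        in a sample message and normalizing them to probabilities."""
        counts = Counter(message)
        total = len(message)
        return entropy_bits(count / total for count in counts.values())

    if __name__ == "__main__":
        # A fair binary source (p = 0.5, 0.5) carries exactly 1 bit per symbol.
        print(entropy_bits([0.5, 0.5]))          # 1.0
        # A heavily biased source carries less information per symbol.
        print(entropy_bits([0.9, 0.1]))          # ~0.469
        # Entropy estimated from a sample message over a 4-symbol alphabet.
        print(empirical_entropy("AABABCABDD"))   # ~1.846

As the examples suggest, entropy is maximal when all symbols are equally likely and shrinks as the distribution becomes more predictable, which is why a highly biased source conveys less information per symbol.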