Among the more interesting trends of the past half-century has been the consolidation of probability, statistics, combinatorial optimization, information theory, and computer science into a single imposing discipline of informatics.
Minimal Belief Change
Of special philosophical interest is the enrichment of Bayesian inference by a rule of belief change that proceeds by minimizing the distance between two probability distributions, P = (p1, … , pk) and Q = (q1, … , qk), as measured by the expected log-likelihood ratio:
(1) H(P, Q) = p1 ln(p1/q1) + … + pk ln(pk/qk)
The likelihood ratio, P(e|h):P(e|k), is a fundamental index of the support that e accords h over k (see the entry "Foundations of Statistics").
Using the visually transparent Gibbs inequality,
(2) ln x ≤ x − 1
with equality if and only if (iff) x = 1, in the equivalent form ln x ≥ 1 − 1/x, it follows that H(P, Q) ≥ 0 with equality iff P = Q: taking x = pi/qi in each term gives pi ln(pi/qi) ≥ pi − qi, and summing over i yields 1 − 1 = 0. Notice, however, that H(P, Q) ≠ H(Q, P) in general; the measure is not symmetric in its two arguments.
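These two properties, nonnegativity with equality only at P = Q, and asymmetry, can be checked numerically. The following is a minimal Python sketch, where the function name `relative_entropy` and the sample distributions are illustrative choices, not part of the original text:

```python
import math

def relative_entropy(p, q):
    """H(P, Q) = sum_i p_i * ln(p_i / q_i), the expected log-likelihood ratio.

    Terms with p_i = 0 contribute nothing (the limit of x ln x at 0 is 0).
    """
    assert len(p) == len(q)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two illustrative distributions on three outcomes.
P = [0.5, 0.3, 0.2]
Q = [0.25, 0.25, 0.5]

print(relative_entropy(P, P))  # 0.0: H(P, Q) vanishes iff P = Q
print(relative_entropy(P, Q))  # positive, by the Gibbs inequality
print(relative_entropy(Q, P))  # a different positive value: H is asymmetric
```

Note that H(P, Q) diverges when some qi = 0 while pi > 0, which is why the minimization rule requires Q to assign positive probability wherever P does.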