
## Information theory
Claude E. Shannon (1916–2001) has been called "the father of information theory". His theory for the first time treated communication as a rigorously stated mathematical problem in statistics and gave communications engineers a way to determine the capacity of a communication channel in terms of the common currency of bits. The transmission part of the theory is not concerned with the meaning (semantics) of the message conveyed, though the complementary wing of information theory concerns itself with content through lossy compression of messages subject to a fidelity criterion. These two wings of information theory are joined together and mutually justified by the information transmission theorems, or source-channel separation theorems, which justify the use of bits as the universal currency for information in many contexts.
It is generally believed that the modern discipline of information theory began with the publication of Shannon's article "A Mathematical Theory of Communication" in the Bell System Technical Journal in 1948.

Recently, however, it has emerged that entropy was defined and used during the Second World War by Alan Turing at Bletchley Park. Turing named it 'weight of evidence' and measured it in units called bans and decibans. Turing and Shannon collaborated during the war, but it appears that they created the concept independently. (References are given in *Alan Turing: The Enigma* by Andrew Hodges.)

Entropy as defined by Shannon is closely related to entropy as defined by physicists. Boltzmann and Gibbs did considerable work on statistical thermodynamics, and this work was the inspiration for adopting the term entropy in information theory. There are deep relationships between entropy in the thermodynamic and informational senses. For instance, Maxwell's demon needs information to reverse thermodynamic entropy, and acquiring that information exactly balances out the thermodynamic gain that the demon would otherwise achieve.
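As a concrete illustration of entropy as a measure of uncertainty in bits (and, in Turing's units, bans), here is a minimal sketch; the coin distributions are arbitrary illustrative examples, not taken from the article:

```python
import math

def entropy(probs, base=2.0):
    """Shannon entropy of a discrete distribution.

    With base 2 the result is in bits; with base 10 it is in bans,
    the unit Turing used at Bletchley Park (a deciban is a tenth of a ban).
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries exactly one bit of uncertainty.
fair_coin = [0.5, 0.5]
print(entropy(fair_coin))            # 1.0 bit
print(entropy(fair_coin, base=10))   # ~0.301 bans (= log10 of 2)

# A biased coin is more predictable, so its entropy is lower.
biased_coin = [0.9, 0.1]
print(entropy(biased_coin))          # ~0.469 bits
```

The same function, with the base swapped, converts freely between bits and bans, which is one sense in which bits serve as a universal currency for information.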
Among other useful measures of information is mutual information, a measure of the statistical dependence (correlation) between two random variables. For two random variables $X$ and $Y$ it can be written in terms of entropies as

$$I(X;Y) = H(X) + H(Y) - H(X,Y),$$

where $H(X)$ and $H(Y)$ are the individual entropies and $H(X,Y)$ is the joint entropy. Mutual information is closely related to the log-likelihood ratio test for multinomials and to
Pearson's χ² test.

Shannon information is appropriate for measuring uncertainty over an unordered space. An alternative measure of information was created by Fisher for measuring uncertainty over an ordered space. For example, Shannon information is used over the space of alphabetic letters, as letters do not have 'distances' between them. For information about the value of a continuous parameter such as a person's height, Fisher information is used, as estimated heights do have a well-defined distance.

Differences in Shannon information correspond to a special case of the Kullback–Leibler divergence of Bayesian statistics, a measure of the distance between the prior and posterior probability distributions.

A. N. Kolmogorov introduced an information measure that is based on the shortest algorithm that can recreate a given object; see Kolmogorov complexity.

## See also

- James Tenney
- Information geometry
- Algorithmic information theory
- Fisher information
- Important publications in information theory
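The mutual-information measure discussed above satisfies the standard entropy identity $I(X;Y) = H(X) + H(Y) - H(X,Y)$, which can be checked numerically. A minimal sketch; the joint distribution is an arbitrary illustrative example:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Arbitrary illustrative joint distribution p(x, y) over two binary variables.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

# Marginal distributions p(x) and p(y).
px = [joint[(0, 0)] + joint[(0, 1)], joint[(1, 0)] + joint[(1, 1)]]
py = [joint[(0, 0)] + joint[(1, 0)], joint[(0, 1)] + joint[(1, 1)]]

# Mutual information via the entropy identity I(X;Y) = H(X) + H(Y) - H(X,Y).
mi = entropy(px) + entropy(py) - entropy(list(joint.values()))

# Equivalent direct form: sum over x, y of p(x,y) * log2(p(x,y) / (p(x)p(y))).
mi_direct = sum(p * math.log2(p / (px[x] * py[y]))
                for (x, y), p in joint.items() if p > 0)

assert abs(mi - mi_direct) < 1e-9
print(mi)  # positive here, since X and Y are dependent; 0 would mean independence
```

The assertion confirms that the entropy form and the direct form agree, which is the sense in which mutual information measures the shared uncertainty of the two variables.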
## References

- Claude E. Shannon and Warren Weaver. *The Mathematical Theory of Communication.* Univ of Illinois Press, 1963. ISBN 0252725484
## External links

- Claude E. Shannon's original paper
- On-line textbook: *Information Theory, Inference, and Learning Algorithms*, by David MacKay, gives an entertaining and thorough introduction to Shannon theory, including state-of-the-art methods from communication theory such as arithmetic coding, low-density parity-check codes, and turbo codes.
This article is completely or partly from Wikipedia, the free online encyclopedia. The text is made available under the terms of the GNU Free Documentation Licence.