

Information theory






Information theory is a branch of the mathematical theory of probability and mathematical statistics that quantifies the concept of information. It is concerned with information entropy, communication systems, data transmission, rate-distortion theory, cryptography, data compression, error correction, and related topics. It is not to be confused with library and information science or information technology.

Claude E. Shannon (1916-2001) has been called "the father of information theory". His theory for the first time considered communication as a rigorously stated mathematical problem in statistics and gave communications engineers a way to determine the capacity of a communication channel in terms of the common currency of bits. The transmission part of the theory is not concerned with the meaning (semantics) of the message conveyed, though the complementary wing of information theory concerns itself with content through lossy compression of messages subject to a fidelity criterion.

These two wings of information theory are joined together and mutually justified by the information transmission theorems, or source-channel separation theorems, that justify the use of bits as the universal currency for information in many contexts.

It is generally believed that the modern discipline of information theory began with the publication of Shannon's article "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October of 1948. This work drew on earlier publications by Harry Nyquist and Ralph Hartley. In the process of working out a theory of communications that could be applied by electrical engineers to design better telecommunications systems, Shannon defined a measure of entropy:

H = - Σ_i p_i log p_i

(where p_i is the probability of i) that, when applied to an information source, could determine the capacity of the channel required to transmit the source as encoded binary digits. If the logarithm in the formula is taken to base 2, then it gives a measure of entropy in bits. Shannon's measure of entropy came to be taken as a measure of the information contained in a message, as opposed to the portion of the message that is strictly determined (hence predictable) by inherent structures, such as redundancy in the structure of languages or the statistical properties of a language relating to the frequencies of occurrence of different letter or word pairs, triplets, etc. See Markov chains.
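As a rough illustration, the short Python sketch below (the function name and the toy distributions are assumptions made for the example, not taken from the article) evaluates this sum with a base-2 logarithm, so the result is in bits:

    import math

    def shannon_entropy(probabilities):
        # H = -sum of p * log2(p) over the outcomes; zero-probability outcomes contribute nothing.
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # A fair coin carries exactly 1 bit of entropy per toss;
    # a biased coin is more predictable, so it carries less.
    print(shannon_entropy([0.5, 0.5]))   # 1.0
    print(shannon_entropy([0.9, 0.1]))   # about 0.469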

Recently, however, it has emerged that entropy was defined and used during the Second World War by Alan Turing at Bletchley Park. Turing named it 'weight of evidence' and measured it in units called bans and decibans. Turing and Shannon collaborated during the war, but it appears that they independently created the concept. (References are given in Alan Turing: The Enigma by Andrew Hodges.)

Entropy as defined by Shannon is closely related to entropy as defined by physicists. Boltzmann and Gibbs did considerable work on statistical thermodynamics, and this work was the inspiration for adopting the term entropy in information theory. There are deep relationships between entropy in the thermodynamic and informational senses. For instance, Maxwell's demon needs information to reverse thermodynamic entropy, and acquiring that information exactly balances out the thermodynamic gain that the demon would otherwise achieve.

Among other useful measures of information is mutual information, a measure of the correlation between two random variables. Mutual information is defined for two random variables X and Y as

I(X;Y) = H(X) + H(Y) - H(X,Y) = H(X) - H(X|Y)

where H(X,Y) is the joint entropy, defined as

H(X,Y) = - Σ_{x,y} p(x,y) log p(x,y)

and H(X|Y) is the conditional entropy of X conditioned on observing Y. As such, the mutual information can be intuitively considered the amount of uncertainty in X that is eliminated by observations of Y, and vice versa.
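To make the relationship between these quantities concrete, here is a minimal Python sketch (the joint probability table is an assumed toy example) that computes I(X;Y) as H(X) + H(Y) - H(X,Y) for two binary variables:

    import math

    def entropy(probs):
        # H = -sum of p * log2(p), skipping zero entries.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Assumed joint distribution p(x, y): rows index X, columns index Y.
    joint = [[0.4, 0.1],
             [0.1, 0.4]]

    p_x = [sum(row) for row in joint]                  # marginal distribution of X
    p_y = [sum(col) for col in zip(*joint)]            # marginal distribution of Y
    h_xy = entropy([p for row in joint for p in row])  # joint entropy H(X,Y)

    mutual_information = entropy(p_x) + entropy(p_y) - h_xy   # equals H(X) - H(X|Y)
    print(mutual_information)   # about 0.278 bits

In this toy table the two variables agree 80% of the time, so observing Y removes about 0.278 bits of the one bit of uncertainty in X.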

Mutual information is closely related to the log-likelihood ratio test for multinomial distributions and to Pearson's χ² test.

Shannon information is appropriate for measuring uncertainty over an unordered space. An alternative measure of information was created by Fisher for measuring uncertainty over an ordered space. For example, Shannon information is used over the space of alphabetic letters, as letters do not have 'distances' between them. For information about the value of a continuous parameter such as a person's height, Fisher information is used, as estimated heights do have a well-defined distance.
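In standard notation, for a model with density f(x; θ) the Fisher information of the parameter θ is

I(θ) = E[ (∂ log f(X; θ) / ∂θ)² ],

the expected squared sensitivity of the log-likelihood to the parameter.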

Differences in Shannon information correspond to a special case of the Kullback-Leibler divergence of Bayesian statistics, a measure of the distance between the prior and posterior probability distributions.
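As a small illustration, here is a Python sketch of this divergence between an assumed prior and posterior over two outcomes (the distributions are made up for the example; note that the divergence is directed rather than a symmetric distance):

    import math

    def kl_divergence(p, q):
        # D(P || Q) = sum of p * log2(p / q); P must place no mass where Q has none.
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    prior     = [0.5, 0.5]   # belief before observing data
    posterior = [0.9, 0.1]   # belief after observing data
    print(kl_divergence(posterior, prior))   # about 0.531 bits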

A. N. Kolmogorov introduced a measure of information based on the length of the shortest algorithm that can recreate a given string; see Kolmogorov complexity.


References

  • Claude E. Shannon and Warren Weaver, The Mathematical Theory of Communication, University of Illinois Press, 1963. ISBN 0252725484.






