Information theory
Information theory studies the quantification, storage, and communication of information.
Description
Information theory was originally proposed by Claude Shannon in 1948, in a landmark paper entitled "A Mathematical Theory of Communication", to find fundamental limits on signal processing and communication operations such as data compression.
The theory has since found applications in many other areas, including statistical inference, natural language processing, cryptography, neurobiology, the evolution and function of molecular codes, model selection in ecology, thermal physics, quantum computing, linguistics, plagiarism detection, pattern recognition, and anomaly detection.
Entropy
A key measure in information theory is entropy, which is usually expressed as the average number of bits needed to store or communicate one symbol in a message. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than identifying the outcome of a roll of a die (six equally likely outcomes).
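To make the comparison concrete, here is a minimal Python sketch (the `shannon_entropy` helper name is our own illustrative choice) that computes the entropy of both examples:

```python
import math

def shannon_entropy(probabilities):
    """Entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Fair coin: two equally likely outcomes -> 1 bit of uncertainty.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# Fair six-sided die: six equally likely outcomes -> log2(6) bits.
print(shannon_entropy([1 / 6] * 6))  # ≈ 2.585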
Other measures
Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy.
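As a sketch of one of these quantities, the snippet below computes relative entropy (the Kullback-Leibler divergence) between two distributions; the function name and example distributions are our own illustrations:

```python
import math

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) in bits: how far q is from p.

    Assumes q[i] > 0 wherever p[i] > 0; equals zero exactly when p == q.
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

fair = [0.5, 0.5]      # a fair coin
biased = [0.9, 0.1]    # a heavily biased coin
print(relative_entropy(fair, biased))  # ≈ 0.737 bits
print(relative_entropy(fair, fair))    # 0.0
```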
Applications
Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s and JPEGs), and channel coding (e.g. for Digital Subscriber Line (DSL)).
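A brief sketch using Python's standard zlib module hints at why lossless compression works: redundant (low-entropy) input shrinks dramatically, while high-entropy data barely compresses at all, since entropy sets the limit on how far data can be compressed without loss:

```python
import os
import zlib

redundant = b"abab" * 1000      # 4000 highly predictable bytes
random_data = os.urandom(4000)  # 4000 bytes of high-entropy data

# Redundant input shrinks dramatically; random input barely compresses,
# and can even grow slightly, because there is no redundancy to remove.
print(len(zlib.compress(redundant)))    # a few dozen bytes
print(len(zlib.compress(random_data)))  # roughly 4000 bytes
```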
The field is at the intersection of mathematics, statistics, computer science, physics, neurobiology, and electrical engineering. Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones, the development of the Internet, the study of linguistics and of human perception, the understanding of black holes, and numerous other fields.
Important sub-fields of information theory include source coding, channel coding, algorithmic complexity theory, algorithmic information theory, information-theoretic security, and measures of information.
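As a toy example of channel coding, here is a sketch of the simplest error-correcting code, a threefold repetition code with majority-vote decoding (our illustrative choice; real systems such as DSL use far stronger codes):

```python
def encode_repetition(bits, n=3):
    """Channel code: repeat each bit n times so a decoder can outvote noise."""
    return [b for bit in bits for b in [bit] * n]

def decode_repetition(received, n=3):
    """Majority vote over each group of n received bits."""
    return [1 if sum(received[i:i + n]) > n // 2 else 0
            for i in range(0, len(received), n)]

message = [1, 0, 1, 1]
sent = encode_repetition(message)  # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
sent[1] ^= 1                       # the channel flips one bit (noise)
print(decode_repetition(sent))     # [1, 0, 1, 1] -- the error is corrected
```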
See also
- Artificial intelligence
- Bandwidth (computing)
- Claude Shannon
- Computing
- Entropy
- Error detection and correction
- Information science
- Pattern recognition
- Telecommunication
- Units of information
External links
- Information theory @ Wikipedia