Information theory
Information theory is a branch of applied mathematics, electrical engineering, and computer science involving the quantification of information.
Description
Information theory was developed by Claude E. Shannon to find fundamental limits on signal processing operations such as compressing data, and on reliably storing and communicating data.
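A classic example of such a limit is the Shannon–Hartley channel capacity, which bounds the rate C at which information can be communicated reliably over a noisy channel of bandwidth B with signal-to-noise ratio S/N:

C = B \log_2\!\left(1 + \frac{S}{N}\right)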
Applications
Since its inception it has broadened to find applications in many other areas, including:
- Statistical inference
- Natural language processing
- Cryptography
- Neurobiology
- The evolution and function of molecular codes
- Model selection in ecology
- Thermal physics
- Quantum computing
- Linguistics
- Plagiarism detection
- Pattern recognition
- Anomaly detection
- Other forms of data analysis
Entropy
A key measure of information is entropy, which is usually expressed by the average number of bits needed to store or communicate one symbol in a message.
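For a discrete random variable X taking values x with probabilities p(x), the entropy in bits is commonly written as:

H(X) = -\sum_{x} p(x) \log_2 p(x)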
Entropy quantifies the uncertainty involved in predicting the value of a random variable. For example, specifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (six equally likely outcomes).
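As a minimal sketch (the function below is illustrative, not part of the article), the coin and die figures can be reproduced in a few lines of Python:

```python
import math

def entropy(probabilities):
    # Shannon entropy, in bits, of a discrete distribution
    # given as an iterable of probabilities summing to 1.
    # Zero-probability outcomes contribute nothing to the sum.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Fair coin: two equally likely outcomes -> 1 bit per flip.
print(entropy([0.5, 0.5]))    # 1.0

# Fair six-sided die: six equally likely outcomes -> log2(6) bits per roll.
print(entropy([1/6] * 6))     # ~2.585
```

The die roll therefore carries roughly 2.585 bits of information per outcome, compared with 1 bit for the coin flip, matching the intuition that the die's outcome is harder to predict.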
See also
- Artificial intelligence
- Bandwidth (computing)
- Computing
- Entropy
- Information science
- Pattern recognition
- Shannon, Claude E.
- Telecommunication
- Units of information
External links
- Information theory @ Wikipedia