Information theory
Information theory is a branch of applied mathematics, electrical engineering, and computer science involving the quantification of information.
(TO DO: expand, organize, cross-reference, illustrate.)
Description
Information theory was developed by Claude E. Shannon to find fundamental limits on signal processing operations such as compressing data and reliably storing and communicating data.
Applications
Since its inception it has broadened to find applications in many other areas, including:
- Statistical inference
- Natural language processing
- Cryptography
- Neurobiology
- The evolution and function of molecular codes
- Model selection in ecology
- Thermal physics
- Quantum computing
- Linguistics
- Plagiarism detection
- Pattern recognition
- Anomaly detection
- Other forms of data analysis
Entropy
A key measure of information is entropy, usually expressed as the average number of bits needed to store or communicate one symbol in a message.
Entropy quantifies the uncertainty involved in predicting the value of a random variable. For example, specifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (six equally likely outcomes).
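For a discrete random variable X with outcome probabilities p(x), the entropy is H(X) = −Σ p(x) log₂ p(x), measured in bits when the logarithm is base 2. A minimal sketch in Python illustrating the coin and die examples (the helper name entropy_bits is illustrative, not from this article):

    import math

    def entropy_bits(probabilities):
        # Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0.
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # Fair coin: two equally likely outcomes -> 1.0 bit
    print(entropy_bits([0.5, 0.5]))
    # Fair six-sided die: six equally likely outcomes -> about 2.585 bits
    print(entropy_bits([1 / 6] * 6))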
See also
- Artificial intelligence
- Computing
- Entropy
- Information science
- Pattern recognition
- Shannon, Claude E.
- Telecommunication
- Units of information
External links
- Information theory @ Wikipedia