Information theory

Information theory is a branch of applied mathematics, electrical engineering, and computer science involving the quantification of information.

Description

Information theory was developed by Claude E. Shannon to find fundamental limits on signal processing operations such as compressing data and reliably storing and communicating data.

Applications

Since its inception it has broadened to find applications in many other areas, including statistical inference, cryptography, natural language processing, and neurobiology.

Entropy

A key measure of information is entropy, which is usually expressed by the average number of bits needed to store or communicate one symbol in a message.
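In standard notation (the symbols here are supplied for reference and do not appear in the original article), for a discrete random variable X whose outcomes occur with probabilities p(x), this average is

<math>H(X) = -\sum_{x} p(x) \log_2 p(x)</math>

measured in bits when the logarithm is taken to base 2.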

Entropy quantifies the uncertainty involved in predicting the value of a random variable. For example, specifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (six equally likely outcomes).
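The coin and die figures can be checked directly. The following Python sketch (the entropy helper and its name are illustrative, not taken from the article) computes the entropy in bits of each distribution:

 import math
 
 def entropy(probabilities):
     # Shannon entropy in bits: -sum of p * log2(p) over outcomes with p > 0
     return -sum(p * math.log2(p) for p in probabilities if p > 0)
 
 # Fair coin: two equally likely outcomes -> 1 bit
 print(entropy([0.5, 0.5]))   # 1.0
 
 # Fair six-sided die: six equally likely outcomes -> about 2.585 bits
 print(entropy([1/6] * 6))    # 2.584962500721156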

See also

* [[Computing]]
* [[Entropy]]
* [[Error detection and correction]]
* [[Information science]]
* [[Pattern recognition]]

External links