'''Information theory''' is a branch of applied mathematics, electrical engineering, and [[computer science]] involving the quantification of [[information]].
 
Information theory was developed by [[Claude E. Shannon]] to find fundamental limits on signal processing operations such as compressing data, and on reliably storing and communicating data.

Since its inception it has broadened to find applications in many other areas, including:
 
* Statistical inference
* Natural language processing
* [[Cryptography]]
* Neurobiology
* The evolution and function of molecular codes
* Linguistics
* Plagiarism detection
* [[Pattern recognition]]
* [[Anomaly detection]]
* Other forms of [[data analysis]]
 
== Entropy ==
 
A key measure of information is entropy, which is usually expressed by the average number of bits needed to store or communicate one symbol in a message.
 
Entropy quantifies the uncertainty involved in predicting the value of a random variable. For example, specifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (six equally likely outcomes).
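
For a discrete random variable with outcome probabilities p<sub>1</sub>, ..., p<sub>n</sub>, the Shannon entropy is H = -Σ p<sub>i</sub> log<sub>2</sub> p<sub>i</sub> bits. The short Python sketch below is an illustrative example (not part of the original article; the function name <code>entropy</code> is an arbitrary choice) that checks the coin and die comparison numerically:

<pre>
from math import log2

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin has two equally likely outcomes -> 1 bit of entropy.
coin = [1 / 2, 1 / 2]

# A fair die has six equally likely outcomes -> about 2.585 bits of entropy.
die = [1 / 6] * 6

print(entropy(coin))  # 1.0
print(entropy(die))   # 2.584962500721156
</pre>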
== See also ==

* [[Information science]]

== External links ==
 
* [https://en.wikipedia.org/wiki/Information_theory Information theory] @ Wikipedia
