Information theory
 
'''Information theory''' is a branch of [[applied mathematics]], [[electrical engineering]], and [[computer science]] involving the [[quantification of information]].
 
(TO DO: expand, organize, cross-reference, illustrate.)
  
 
== Description ==
 
Information theory was developed by [[Claude E. Shannon]] to find fundamental limits on [[signal processing]] operations such as compressing data and reliably storing and communicating data.

== Applications ==

Since its inception it has broadened to find applications in many other areas, including [[statistical inference]], [[natural language processing]], [[cryptography]], and [[neurobiology]].

== Entropy ==

A key measure of information is [[entropy]], which is usually expressed as the average number of [[bit]]s needed to store or communicate one symbol in a message.
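
In standard notation, for a discrete random variable <math>X</math> taking value <math>x</math> with probability <math>p(x)</math>, the entropy is

:<math>H(X) = -\sum_{x} p(x) \log_2 p(x)</math>

measured in bits when the logarithm is taken base 2.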

Entropy quantifies the uncertainty involved in predicting the value of a random variable. For example, specifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (six equally likely outcomes).
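
The following is a minimal sketch in Python of this calculation (the helper function <code>entropy</code> is illustrative, not a library API):

<pre>
from math import log2

def entropy(probabilities):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# Fair coin: two equally likely outcomes -> 1 bit
print(entropy([0.5, 0.5]))   # 1.0

# Fair six-sided die: six equally likely outcomes -> log2(6) bits
print(entropy([1/6] * 6))    # approximately 2.585
</pre>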

== See also ==

== External links ==