Algorithmic information theory

From Wiki @ Karl Jones dot com

Latest revision as of 19:15, 13 May 2016

Algorithmic information theory is a subfield of information theory and computer science that concerns itself with the relationship between computation and information.

According to Gregory Chaitin, it is "the result of putting Shannon's information theory and Turing's computability theory into a cocktail shaker and shaking vigorously."

== See also ==

* [[Algorithm]]
* [[Algorithmic probability]]
* [[Algorithmically random sequence]]
* [[Chaitin's constant]]
* [[Chaitin–Kolmogorov randomness]]
* [[Computationally indistinguishable]]
* [[Computer science]]
* [[Distribution ensemble]]
* [[Epistemology]]
* [[Inductive inference]]
* [[Inductive probability]]
* [[Information theory]]
* [[Invariance theorem]]
* [[Kolmogorov complexity]]
* [[Limits of knowledge]]
* [[Mathematics]]
* [[Minimum description length]]
* [[Minimum message length]]
* [[Pseudorandom ensemble]]
* [[Pseudorandom generator]]
* [[Simplicity theory]]
* [[Solomonoff's theory of inductive inference]]
* [[Statistical randomness]]
* [[Uniform ensemble]]
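The field's central measure, the length of the shortest program that produces a string ([[Kolmogorov complexity]]), is uncomputable in general, but an ordinary compressor gives a computable upper bound on it. A minimal illustrative sketch in Python (not part of the original article; zlib is used here only as a crude stand-in for an ideal compressor):

```python
import random
import zlib


def compressed_length(s: str) -> int:
    """Length in bytes of the zlib-compressed string.

    Kolmogorov complexity is uncomputable, so compressed size serves
    here as a rough, computable upper-bound proxy for it.
    """
    return len(zlib.compress(s.encode("utf-8"), level=9))


# A highly patterned string: a very short program ("print 'ab' 500 times")
# describes it, so its algorithmic information content is low.
patterned = "ab" * 500

# A pseudorandom string of the same length: it has no obviously shorter
# description than itself, so it resists compression.
random.seed(0)  # fixed seed so the example is reproducible
noisy = "".join(random.choice("ab") for _ in range(1000))

print(compressed_length(patterned))  # small: the repetition compresses away
print(compressed_length(noisy))      # much larger: close to incompressible
```

The gap between the two printed lengths is the intuition behind algorithmically random sequences: a string is "random" in this sense precisely when no program much shorter than the string itself can generate it.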

== External links ==