Algorithmically random sequence
Intuitively, an algorithmically random sequence (or random sequence) is an infinite sequence of binary digits that appears random to any algorithm.
Description
The notion can be applied analogously to sequences on any finite alphabet (e.g. decimal digits). Random sequences are key objects of study in algorithmic information theory.
Because different types of algorithms are sometimes considered, ranging from algorithms with specific bounds on their running time to algorithms that may ask questions of an oracle, there are different notions of randomness.
The most common of these is known as Martin-Löf randomness (or 1-randomness), but stronger and weaker forms of randomness also exist.
The term "random" used to refer to a sequence without clarification is usually taken to mean "Martin-Löf random" (defined below).
Because infinite sequences of binary digits can be identified with real numbers in the unit interval, random binary sequences are often called random real numbers.
Additionally, each infinite binary sequence is the characteristic function of a set of natural numbers, so such sequences can also be viewed as sets of natural numbers.
The class of all Martin-Löf random (binary) sequences is denoted by RAND or MLR.
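The two identifications above can be made concrete on finite prefixes. The following Python sketch is only illustrative (the function names and the sample prefix are not from the literature): it maps a prefix b_0 b_1 ... b_(n-1) of a binary sequence to the dyadic rational sum of b_i * 2^-(i+1), an approximation of the real number in the unit interval identified with the full sequence, and to the finite part of the corresponding set of natural numbers, namely the positions holding a 1.

    # Illustrative sketch: correspondences between binary sequences,
    # reals in [0, 1], and sets of natural numbers, on a finite prefix.

    def prefix_to_real(bits):
        # Dyadic approximation: b_0/2 + b_1/4 + b_2/8 + ...
        return sum(b * 2 ** -(i + 1) for i, b in enumerate(bits))

    def prefix_to_set(bits):
        # Read the prefix as a characteristic function: the set
        # contains exactly the positions at which the bit is 1.
        return {i for i, b in enumerate(bits) if b == 1}

    # Example with an arbitrary (hypothetical) prefix.
    bits = [1, 0, 1, 1, 0, 0, 1]
    print(prefix_to_real(bits))  # 0.6953125 (= 1/2 + 1/8 + 1/16 + 1/128)
    print(prefix_to_set(bits))   # {0, 2, 3, 6}

Extending the prefix refines the real-number approximation and reveals more of the corresponding set; the infinite sequence itself determines both objects exactly.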