Markov chain

From Wiki @ Karl Jones dot com

In probability theory and statistics, a Markov chain or Markoff chain, named after the Russian mathematician Andrey Markov, is a stochastic process that satisfies the Markov property (usually characterized as "memorylessness").

Description

Loosely speaking, a process satisfies the Markov property if one can make predictions for the future of the process based solely on its present state just as well as one could knowing the process's full history; that is, conditional on the present state of the system, its future and past are independent.
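In standard notation (not given in the article itself), the Markov property for a discrete-time process X_0, X_1, X_2, ... can be written as:

P(X_{n+1} = x \mid X_n = x_n, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)

That is, the conditional distribution of the next state given the entire history equals the conditional distribution given the present state alone.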

In discrete time, the process is known as a discrete-time Markov chain (DTMC). It undergoes transitions from one state to another on a state space, with the probability distribution of the next state depending only on the current state and not on the sequence of events that preceded it.
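A DTMC can be simulated directly from its transition probabilities. The following sketch uses a hypothetical two-state "weather" model (the states and probabilities are illustrative assumptions, not taken from the article); the key point is that each step samples the next state from a distribution that depends only on the current state.

```python
import random

# Hypothetical two-state model; states and probabilities are illustrative.
# transition[i][j] = P(next state = j | current state = i)
transition = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(current, rng):
    """Sample the next state using only the current state (Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in transition[current].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Return a sample path of length n+1 starting from `start`."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1], rng))
    return chain

chain = simulate("sunny", 10)
```

Note that `step` never looks at earlier entries of the chain; the full history is irrelevant once the current state is known.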

In continuous time, the process is known as a continuous-time Markov chain (CTMC, or continuous-time Markov process). It takes values in some finite state space; the time spent in each state is a non-negative real number with an exponential distribution. The future behavior of the model (both the remaining time in the current state and the next state) depends only on the current state of the model, not on its historical behavior.
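The exponential holding times described above can be sketched in a simulation. This example uses an assumed two-state model with made-up exit rates (with only two states, the next state after a jump is forced); both the holding time and the jump depend only on the current state.

```python
import random

# Illustrative two-state CTMC; the rates are assumptions, not from the article.
# rates[s] is the exit rate of state s: holding time ~ Exponential(rates[s]).
rates = {"on": 1.0, "off": 0.5}
jump = {"on": "off", "off": "on"}  # two states, so the next state is forced

def simulate_ctmc(start, t_max, seed=0):
    """Return a list of (entry_time, state) pairs up to time t_max."""
    rng = random.Random(seed)
    t, state = 0.0, start
    path = [(t, state)]
    while True:
        # Exponentially distributed time in the current state; the memoryless
        # property of the exponential is what makes the process Markov.
        hold = rng.expovariate(rates[state])
        t += hold
        if t >= t_max:
            break
        state = jump[state]
        path.append((t, state))
    return path

path = simulate_ctmc("on", 10.0)
```

The exponential distribution is the only continuous memoryless distribution, which is why it is the natural choice of holding-time law for a continuous-time Markov process.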

Markov chains have many applications as statistical models of real-world processes.
