Convergence of random variables
In probability theory, there exist several different notions of convergence of sequences of random variables, including convergence in probability, convergence in distribution, and almost sure convergence.
Description
The convergence of sequences of random variables to some limit random variable is an important concept in probability theory and in its applications to statistics and stochastic processes.
The same concepts are known in more general mathematics as stochastic convergence, and they formalize the idea that a sequence of essentially random or unpredictable events can sometimes be expected to settle into a behavior that is essentially unchanging when items far enough into the sequence are studied.
The different possible notions of convergence relate to how such a behavior can be characterized: two readily understood behaviors are that the sequence eventually takes a constant value, and that values in the sequence continue to change but can be described by an unchanging probability distribution.
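A small simulation can make these two behaviors concrete. The following Python sketch (not part of the original article; it uses NumPy, and the seed and sample sizes are arbitrary choices for illustration) shows a running mean settling to a constant, as in the law of large numbers, and standardized sums whose distribution stabilizes to a normal shape, as in the central limit theorem:

```python
import numpy as np

rng = np.random.default_rng(seed=42)  # fixed seed for reproducibility

# Behavior 1: the sequence eventually settles to a constant.
# Running means of fair coin flips approach 0.5, illustrating the
# law of large numbers (convergence in probability / almost surely).
flips = rng.integers(0, 2, size=100_000)
running_mean = np.cumsum(flips) / np.arange(1, flips.size + 1)
print(running_mean[[99, 9_999, 99_999]])  # values drift toward 0.5

# Behavior 2: values keep fluctuating, but their distribution stabilizes.
# Standardized sums of n coin flips take on a N(0, 1) shape,
# illustrating the central limit theorem (convergence in distribution).
n = 1_000
sums = rng.integers(0, 2, size=(5_000, n)).sum(axis=1)
standardized = (sums - n * 0.5) / np.sqrt(n * 0.25)  # mean n/2, variance n/4
print(standardized.mean(), standardized.std())  # roughly 0 and 1
```

The first behavior corresponds to the sums themselves converging to a fixed number; the second shows individual values that never stop varying, yet whose distribution no longer changes as n grows.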
See also
- Asymptotic distribution
- Big O in probability notation
- Continuous stochastic process - the question of continuity of a stochastic process is essentially a question of convergence, and many of the same concepts and relationships apply to the continuity question.
- Convergence of measures
- Limit of a sequence
- Proofs of convergence of random variables
- Skorokhod's representation theorem
- Tweedie distribution