Estimation theory is a branch of statistics that deals with estimating the values of parameters based on measured empirical data that has a random component.
Description
The parameters describe an underlying physical setting in such a way that their value affects the distribution of the measured data. An estimator attempts to approximate the unknown parameters using the measurements.
For example, suppose one wishes to estimate the proportion of a population of voters who will vote for a particular candidate. That proportion is the parameter sought; the estimate is based on a small random sample of voters.
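A minimal sketch of this idea, using hypothetical poll data and the sample proportion as the estimator:

```python
import random

def estimate_proportion(sample):
    """Estimate the population proportion as the fraction of 1s in the sample."""
    return sum(sample) / len(sample)

# Hypothetical illustration: poll 1,000 voters when the true (unknown)
# proportion favouring the candidate is 0.53.
random.seed(0)
true_p = 0.53
sample = [1 if random.random() < true_p else 0 for _ in range(1000)]
print(estimate_proportion(sample))  # close to, but generally not exactly, 0.53
```

Because the sample is random, the estimate differs from the true proportion; polling more voters reduces the variance of the estimator.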
As another example, in radar the goal is to estimate the range of objects (airplanes, boats, etc.) by analyzing the two-way transit time of received echoes of transmitted pulses. Since the reflected pulses are unavoidably embedded in electrical noise, their measured values are randomly distributed, so the transit time must be estimated.
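A minimal sketch of the radar case, assuming the noisy round-trip times are simply averaged and converted to distance via range = c·t/2 (all values illustrative):

```python
import random

C = 3.0e8  # approximate speed of light, m/s

def estimate_range(transit_times):
    """Estimate target range from noisy two-way transit times.

    Averages the round-trip measurements and converts the mean to a
    one-way distance: range = c * t / 2.
    """
    mean_t = sum(transit_times) / len(transit_times)
    return C * mean_t / 2.0

# Hypothetical illustration: a target 15 km away has a true round-trip time
# of 2 * 15000 / c = 100 microseconds; each echo measurement is corrupted
# by Gaussian timing noise.
random.seed(1)
true_t = 2 * 15_000 / C
echoes = [random.gauss(true_t, 5e-6) for _ in range(50)]
print(estimate_range(echoes))  # roughly 15,000 m
```

Averaging many echoes is only one possible estimator; the choice of estimator, and how close it can get to the true range, is exactly what estimation theory studies.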
In estimation theory, two approaches are generally considered:
- The probabilistic approach (described in this article) assumes that the measured data is random with a probability distribution dependent on the parameters of interest.
- The set-membership approach assumes that the measured data vector belongs to a set which depends on the parameter vector (a brief sketch follows this list).
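A minimal sketch of the set-membership idea, assuming a scalar parameter observed with additive noise bounded in magnitude by delta, so each measurement confines the parameter to an interval and the feasible set is the intersection of those intervals (values are illustrative):

```python
def feasible_interval(measurements, delta):
    """Intersect the intervals [y - delta, y + delta] implied by each measurement."""
    lower = max(y - delta for y in measurements)
    upper = min(y + delta for y in measurements)
    if lower > upper:
        raise ValueError("noise bound inconsistent with the measurements")
    return lower, upper

# Hypothetical illustration: unknown scalar parameter near 4.0, each
# measurement error bounded by 0.5 in absolute value.
print(feasible_interval([3.8, 4.3, 4.1, 3.6], delta=0.5))  # approximately (3.8, 4.1)
```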
For example, in electrical communication theory, the measurements which contain information regarding the parameters of interest are often associated with a noisy signal. Without randomness, or noise, the problem would be deterministic and estimation would not be needed.
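A minimal sketch of the probabilistic approach in such a setting, assuming the noisy signal is a constant level plus independent Gaussian noise, in which case the maximum-likelihood estimate reduces to the sample mean:

```python
import random

def ml_estimate_dc_level(measurements):
    """Maximum-likelihood estimate of a constant level in i.i.d. Gaussian noise.

    For y_i = A + n_i with n_i ~ N(0, sigma^2), maximising the likelihood
    over A yields the sample mean of the measurements.
    """
    return sum(measurements) / len(measurements)

# Hypothetical illustration: true level A = 2.0 observed through noise.
random.seed(2)
samples = [2.0 + random.gauss(0.0, 0.3) for _ in range(200)]
print(ml_estimate_dc_level(samples))  # close to 2.0
```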
See also
- Best linear unbiased estimator (BLUE)
- Chebyshev center
- Completeness (statistics)
- Cramér–Rao bound
- Detection theory
- Efficiency (statistics)
- Estimator, Estimator bias
- Expectation-maximization algorithm (EM algorithm)
- Fermi problem
- Grey box model
- Information theory
- Kalman filter
- Least-squares spectral analysis
- Markov chain Monte Carlo (MCMC)
- Matched filter
- Maximum a posteriori (MAP)
- Maximum likelihood
- Maximum entropy spectral estimation
- Method of moments, generalized method of moments
- Minimum mean squared error (MMSE)
- Minimum variance unbiased estimator (MVUE)
- Nonlinear system identification
- Nuisance parameter
- Parametric equation
- Particle filter
- Rao–Blackwell theorem
- Spectral density, Spectral density estimation
- Statistics
- Statistical signal processing
- Sufficiency (statistics)
- Wiener filter
External links
- Estimation theory @ Wikipedia.org