Minimum mean-square error
In statistics and signal processing, a minimum mean square error (MMSE) estimator is an estimator that minimizes the mean square error (MSE), a common measure of estimator quality.

The term MMSE specifically refers to estimation in a Bayesian setting, since in the alternative frequentist setting there does not exist a single estimator having minimal MSE. A somewhat similar concept can be obtained within the frequentist point of view if one requires unbiasedness, since an estimator may exist that minimizes the variance (and hence the MSE) among unbiased estimators. Such an estimator is then called the minimum-variance unbiased estimator (MVUE).

Definition

Let $x$ be an unknown random variable, and let $y$ be a known random variable (the measurement). An estimator $\hat{x}(y)$ is any function of the measurement $y$, and its MSE is given by
    $\mathrm{MSE} = \mathrm{E}\left\{ (\hat{x} - x)^2 \right\},$
where the expectation is taken over both $x$ and $y$.

The MMSE estimator is then defined as the estimator achieving minimal MSE.
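
As a brief illustration (not drawn from the article), the MSE of any candidate estimator can be approximated by Monte Carlo averaging over draws of $x$ and $y$. The sketch below assumes a simple toy model $y = x + \text{noise}$, with all values chosen only for demonstration:

    import numpy as np

    # Toy model (an assumption for illustration): x ~ N(0, 1), y = x + N(0, 0.5^2).
    rng = np.random.default_rng(0)
    n = 100_000
    x = rng.normal(0.0, 1.0, n)        # unknown random variable
    y = x + rng.normal(0.0, 0.5, n)    # known measurement

    def x_hat(y):
        # A candidate estimator of x from y (the scaling 0.7 is arbitrary).
        return 0.7 * y

    # Empirical MSE: the expectation over both x and y, approximated by averaging.
    mse = np.mean((x_hat(y) - x) ** 2)
    print(f"empirical MSE of the candidate estimator: {mse:.4f}")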

In many cases, it is not possible to determine a closed form for the MMSE estimator. In these cases, one possibility is to seek the technique minimizing the MSE within a particular class, such as the class of linear estimators. The linear MMSE estimator is the estimator achieving minimum MSE among all estimators of the form $Ay + b$. If the measurement $y$ is a random vector, $A$ is a matrix and $b$ is a vector. (Such an estimator would more correctly be termed an affine MMSE estimator, but the term linear estimator is widely used.)
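
As a hedged sketch (not part of the article's exposition), the coefficients of the linear (affine) MMSE estimator can be computed from first and second moments using the standard closed form $A = \operatorname{Cov}(x, y)\,\operatorname{Cov}(y)^{-1}$ and $b = \mathrm{E}[x] - A\,\mathrm{E}[y]$. The example below assumes these moments are known; the numerical values are hypothetical:

    import numpy as np

    def linear_mmse(mean_x, mean_y, cov_yy, cov_xy):
        """Return (A, b) for the affine estimator x_hat = A @ y + b."""
        A = cov_xy @ np.linalg.inv(cov_yy)
        b = mean_x - A @ mean_y
        return A, b

    # Hypothetical moments, for illustration only.
    mean_x = np.array([0.0])
    mean_y = np.array([0.0, 0.0])
    cov_yy = np.array([[2.0, 0.5],
                       [0.5, 1.0]])
    cov_xy = np.array([[1.0, 0.3]])    # Cov(x, y), shape (dim_x, dim_y)

    A, b = linear_mmse(mean_x, mean_y, cov_yy, cov_xy)
    y_obs = np.array([1.2, -0.4])
    print("estimate:", A @ y_obs + b)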

Properties

  • Under some weak regularity assumptions, the MMSE estimator is uniquely defined and is given by
    $\hat{x}_{\mathrm{MMSE}}(y) = \mathrm{E}\{x \mid y\}.$
In other words, the MMSE estimator is the conditional expectation of $x$ given the observed value of the measurements (a sketch of the standard argument is given after this list).

  • If $x$ and $y$ are jointly Gaussian, then the MMSE estimator is linear, i.e., it has the form $ay + b$ for constants $a$ and $b$. As a consequence, to find the MMSE estimator, it is sufficient to find the linear MMSE estimator. Such a situation occurs in the example presented in the next section.

  • The orthogonality principle: An estimator $\hat{x}$ is MMSE if and only if
    $\mathrm{E}\{(\hat{x} - x)\, g(y)\} = 0$
for all functions $g(y)$ of the measurements. A different version of the orthogonality principle exists for linear MMSE estimators.
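
The standard argument behind the first property runs as follows (a sketch, using the notation of the Definition section). For any estimator $\hat{x}(y)$, writing $\hat{x} - x = (\hat{x} - \mathrm{E}\{x \mid y\}) + (\mathrm{E}\{x \mid y\} - x)$ and expanding the square gives
    $\mathrm{E}\{(\hat{x} - x)^2\} = \mathrm{E}\{(\hat{x} - \mathrm{E}\{x \mid y\})^2\} + \mathrm{E}\{(\mathrm{E}\{x \mid y\} - x)^2\} + 2\,\mathrm{E}\{(\hat{x} - \mathrm{E}\{x \mid y\})(\mathrm{E}\{x \mid y\} - x)\}.$
Conditioning on $y$ shows that the cross term vanishes, since $\hat{x} - \mathrm{E}\{x \mid y\}$ is a function of $y$ alone and $\mathrm{E}\{\mathrm{E}\{x \mid y\} - x \mid y\} = 0$. The first term is the only one that depends on $\hat{x}$, and it is minimized (to zero) by choosing $\hat{x} = \mathrm{E}\{x \mid y\}$.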

Example

An example can be shown by using a linear combination of random variables $z_1$, $z_2$ and $z_3$ to estimate another random variable $z_4$ using $\hat{z}_4 = \sum_{i=1}^{3} a_i z_i$. If the random variables are real Gaussian random variables with zero mean and a given covariance matrix, we can find coefficients $a_i$ such that $\hat{z}_4$ is an optimal estimate of $z_4$. We use the autocorrelation matrix $R$ of the observed variables and the cross-correlation vector $C$ to find the coefficient vector $A = [a_1, a_2, a_3]^T$ that minimizes the mean square error of the estimate. The autocorrelation matrix is defined as
    $R = \begin{bmatrix} \mathrm{E}[z_1 z_1] & \mathrm{E}[z_2 z_1] & \mathrm{E}[z_3 z_1] \\ \mathrm{E}[z_1 z_2] & \mathrm{E}[z_2 z_2] & \mathrm{E}[z_3 z_2] \\ \mathrm{E}[z_1 z_3] & \mathrm{E}[z_2 z_3] & \mathrm{E}[z_3 z_3] \end{bmatrix}.$
The cross-correlation vector is defined as
    $C = \begin{bmatrix} \mathrm{E}[z_4 z_1] \\ \mathrm{E}[z_4 z_2] \\ \mathrm{E}[z_4 z_3] \end{bmatrix}.$

In order to find the optimal coefficients by the orthogonality principle, we solve the equation $RA = C$ by inverting $R$ and multiplying to get
    $A = R^{-1} C.$

The entries of $A$ are the optimal coefficients for $\hat{z}_4$. Computing the minimum mean square error then gives
    $\lVert e \rVert_{\min}^{2} = \mathrm{E}[z_4 z_4] - C^{T} A.$
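
The same computation is easy to carry out numerically. The sketch below uses an illustrative covariance matrix (hypothetical values, chosen only to demonstrate the mechanics of solving $RA = C$):

    import numpy as np

    # Hypothetical zero-mean covariance matrix for (z1, z2, z3, z4);
    # the values are illustrative, not taken from the article.
    cov = np.array([[1.0, 0.3, 0.2, 0.5],
                    [0.3, 1.0, 0.4, 0.6],
                    [0.2, 0.4, 1.0, 0.4],
                    [0.5, 0.6, 0.4, 1.0]])

    R = cov[:3, :3]    # autocorrelation matrix of (z1, z2, z3)
    C = cov[:3, 3]     # cross-correlation vector between (z1, z2, z3) and z4

    A = np.linalg.solve(R, C)        # optimal coefficients, A = R^{-1} C
    min_mse = cov[3, 3] - C @ A      # minimum MSE, E[z4 z4] - C^T A
    print("optimal coefficients:", A)
    print("minimum mean square error:", min_mse)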

A shorter, non-numerical example can be found in the article on the orthogonality principle.

See also

  • Bayesian estimator
  • Mean squared error
  • Minimum-variance unbiased estimator (MVUE)
  • Orthogonality principle

Further reading

  • Johnson, D. (22 November 2004). Minimum Mean Squared Error Estimators. Connexions.
  • Bibby, J.; Toutenburg, H. (1977). Prediction and Improved Estimation in Linear Models. Wiley. This book looks almost exclusively at minimum mean-square error estimation and inference.
  • Jaynes, E. T. (2003). Probability Theory: The Logic of Science. Cambridge University Press.
  • Moon, T. K.; Stirling, W. C. (2000). Mathematical Methods and Algorithms for Signal Processing. Prentice Hall.