
Quadratic form (statistics)
If $\varepsilon$ is a vector of $n$ random variables, and $\Lambda$ is an $n$-dimensional symmetric matrix, then the scalar quantity

$\varepsilon^\mathsf{T} \Lambda \varepsilon$

is known as a quadratic form in $\varepsilon$.
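As a minimal numerical illustration (the vector and matrix below are made-up values, not taken from any particular model), the quadratic form is just the scalar obtained by sandwiching $\Lambda$ between $\varepsilon^\mathsf{T}$ and $\varepsilon$:

```python
import numpy as np

# Made-up 3-dimensional example: eps plays the role of one realisation of the
# random vector and Lam is a symmetric matrix.
eps = np.array([1.0, -2.0, 0.5])
Lam = np.array([[2.0, 0.3, 0.0],
                [0.3, 1.0, 0.1],
                [0.0, 0.1, 3.0]])

q = eps @ Lam @ eps   # the quadratic form eps^T Lam eps, a scalar
print(q)
```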

Expectation

It can be shown that

$\operatorname{E}\left[\varepsilon^\mathsf{T} \Lambda \varepsilon\right] = \operatorname{tr}(\Lambda \Sigma) + \mu^\mathsf{T} \Lambda \mu$

where $\mu$ and $\Sigma$ are the expected value and variance-covariance matrix of $\varepsilon$, respectively, and $\operatorname{tr}$ denotes the trace of a matrix. This result depends only on the existence of $\mu$ and $\Sigma$; in particular, normality of $\varepsilon$ is not required.
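The expectation formula is straightforward to check numerically. The sketch below is a minimal Monte Carlo illustration assuming NumPy and made-up values of $\mu$, $\Sigma$ and $\Lambda$; the normal distribution is used only for convenience, since the result itself does not require normality.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up mean, covariance matrix and symmetric Lambda.
mu = np.array([1.0, -1.0, 2.0])
A = rng.standard_normal((3, 3))
Sigma = A @ A.T + np.eye(3)            # a valid (positive definite) covariance
Lam = np.array([[2.0, 0.5, 0.0],
                [0.5, 1.0, 0.3],
                [0.0, 0.3, 4.0]])

# Monte Carlo estimate of E[eps^T Lam eps].
eps = rng.multivariate_normal(mu, Sigma, size=200_000)
mc_mean = np.einsum('ij,jk,ik->i', eps, Lam, eps).mean()

# Closed form: tr(Lam Sigma) + mu^T Lam mu.
exact = np.trace(Lam @ Sigma) + mu @ Lam @ mu
print(mc_mean, exact)                  # the two values should be close
```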
Derivation

Since the quadratic form is a scalar quantity, $\varepsilon^\mathsf{T} \Lambda \varepsilon = \operatorname{tr}(\varepsilon^\mathsf{T} \Lambda \varepsilon)$. Note that both $\operatorname{E}[\cdot]$ and $\operatorname{tr}(\cdot)$ are linear operators, so $\operatorname{E}\left[\operatorname{tr}(A)\right] = \operatorname{tr}\left(\operatorname{E}[A]\right)$ for any random matrix $A$. It follows that

$\operatorname{E}\left[\varepsilon^\mathsf{T} \Lambda \varepsilon\right] = \operatorname{E}\left[\operatorname{tr}(\varepsilon^\mathsf{T} \Lambda \varepsilon)\right] = \operatorname{tr}\left(\operatorname{E}[\varepsilon^\mathsf{T} \Lambda \varepsilon]\right)$

and that, by the cyclic property of the trace operator,

$\operatorname{tr}\left(\operatorname{E}[\varepsilon^\mathsf{T} \Lambda \varepsilon]\right) = \operatorname{tr}\left(\operatorname{E}[\Lambda \varepsilon \varepsilon^\mathsf{T}]\right) = \operatorname{tr}\left(\Lambda \operatorname{E}[\varepsilon \varepsilon^\mathsf{T}]\right) = \operatorname{tr}\left(\Lambda (\Sigma + \mu \mu^\mathsf{T})\right) = \operatorname{tr}(\Lambda \Sigma) + \mu^\mathsf{T} \Lambda \mu .$
Variance

In general, the variance of a quadratic form depends greatly on the distribution of $\varepsilon$. However, if $\varepsilon$ does follow a multivariate normal distribution, the variance of the quadratic form becomes particularly tractable. Assume for the moment that $\Lambda$ is a symmetric matrix. Then,

$\operatorname{var}\left[\varepsilon^\mathsf{T} \Lambda \varepsilon\right] = 2 \operatorname{tr}(\Lambda \Sigma \Lambda \Sigma) + 4 \mu^\mathsf{T} \Lambda \Sigma \Lambda \mu .$
In fact, this can be generalized to find the covariance between two quadratic forms on the same $\varepsilon$ (once again, $\Lambda_1$ and $\Lambda_2$ must both be symmetric):

$\operatorname{cov}\left[\varepsilon^\mathsf{T} \Lambda_1 \varepsilon,\ \varepsilon^\mathsf{T} \Lambda_2 \varepsilon\right] = 2 \operatorname{tr}(\Lambda_1 \Sigma \Lambda_2 \Sigma) + 4 \mu^\mathsf{T} \Lambda_1 \Sigma \Lambda_2 \mu .$
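The variance and covariance formulas can be checked the same way. The sketch below (again with arbitrary made-up parameters and symmetric $\Lambda_1$, $\Lambda_2$) compares sample moments of the simulated quadratic forms with the closed-form expressions:

```python
import numpy as np

rng = np.random.default_rng(1)

mu = np.array([0.5, -1.0, 2.0])
A = rng.standard_normal((3, 3))
Sigma = A @ A.T + np.eye(3)
Lam1 = np.diag([1.0, 2.0, 3.0])
Lam2 = np.array([[1.0, 0.4, 0.0],
                 [0.4, 2.0, 0.1],
                 [0.0, 0.1, 0.5]])

eps = rng.multivariate_normal(mu, Sigma, size=500_000)
q1 = np.einsum('ij,jk,ik->i', eps, Lam1, eps)
q2 = np.einsum('ij,jk,ik->i', eps, Lam2, eps)

# var[eps^T Lam1 eps] = 2 tr(Lam1 Sigma Lam1 Sigma) + 4 mu^T Lam1 Sigma Lam1 mu
var_exact = 2 * np.trace(Lam1 @ Sigma @ Lam1 @ Sigma) + 4 * mu @ Lam1 @ Sigma @ Lam1 @ mu
# cov[...] = 2 tr(Lam1 Sigma Lam2 Sigma) + 4 mu^T Lam1 Sigma Lam2 mu
cov_exact = 2 * np.trace(Lam1 @ Sigma @ Lam2 @ Sigma) + 4 * mu @ Lam1 @ Sigma @ Lam2 @ mu

print(q1.var(), var_exact)             # should agree up to Monte Carlo error
print(np.cov(q1, q2)[0, 1], cov_exact)
```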
Computing the variance in the non-symmetric case

Some texts incorrectly state that the above variance or covariance results hold without requiring $\Lambda$ to be symmetric. The case for general $\Lambda$ can be derived by noting that

$\varepsilon^\mathsf{T} \Lambda^\mathsf{T} \varepsilon = \varepsilon^\mathsf{T} \Lambda \varepsilon ,$

so

$\varepsilon^\mathsf{T} \Lambda \varepsilon = \varepsilon^\mathsf{T} \left(\tfrac{1}{2}\left(\Lambda + \Lambda^\mathsf{T}\right)\right) \varepsilon .$

But this is a quadratic form in the symmetric matrix $\tilde{\Lambda} = \tfrac{1}{2}(\Lambda + \Lambda^\mathsf{T})$, so the mean and variance expressions are the same, provided $\Lambda$ is replaced by $\tilde{\Lambda}$ therein.
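A short sketch of the symmetrisation step (with a made-up matrix): replacing a non-symmetric $\Lambda$ by $\tfrac{1}{2}(\Lambda + \Lambda^\mathsf{T})$ leaves every realised value of the quadratic form, and hence its mean and variance, unchanged.

```python
import numpy as np

rng = np.random.default_rng(2)

Lam = rng.standard_normal((3, 3))       # deliberately non-symmetric
Lam_sym = 0.5 * (Lam + Lam.T)           # symmetrised version

eps = rng.standard_normal(3)
print(eps @ Lam @ eps, eps @ Lam_sym @ eps)   # equal (up to rounding)
```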
Examples of quadratic forms

In the setting where one has a set of observations $y$ and an operator matrix $H$, the residual sum of squares can be written as a quadratic form in $y$:

$\textrm{RSS} = y^\mathsf{T} (I - H)^\mathsf{T} (I - H)\, y .$

For procedures where the matrix $H$ is symmetric and idempotent, and the errors are Gaussian with covariance matrix $\sigma^2 I$, $\textrm{RSS}/\sigma^2$ has a chi-squared distribution with $k$ degrees of freedom and noncentrality parameter $\lambda$ (taken here so that such a variable has mean $k + \lambda$ and variance $2(k + 2\lambda)$), where

$k = \operatorname{tr}\left[(I - H)^\mathsf{T} (I - H)\right]$

$\lambda = \frac{\mu^\mathsf{T} (I - H)^\mathsf{T} (I - H)\, \mu}{\sigma^2}$

may be found by matching the first two central moments of a noncentral chi-squared random variable to the expressions given in the Expectation and Variance sections above. If $H y$ estimates $\mu$ with no bias, then the noncentrality $\lambda$ is zero and $\textrm{RSS}/\sigma^2$ follows a central chi-squared distribution.
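As a concrete (assumed) instance of this example, take ordinary least squares, where the operator matrix is the hat matrix $H = X(X^\mathsf{T} X)^{-1} X^\mathsf{T}$, which is symmetric and idempotent. The sketch below, with made-up data, checks that $k = \operatorname{tr}[(I - H)^\mathsf{T}(I - H)] = n - p$ and that the simulated mean of $\textrm{RSS}/\sigma^2$ is close to the central chi-squared mean $k$ when the fit is unbiased.

```python
import numpy as np

rng = np.random.default_rng(3)

# Made-up ordinary least-squares setting.
n, p, sigma = 200, 4, 1.5
X = rng.standard_normal((n, p))
beta = np.array([1.0, -2.0, 0.5, 3.0])

H = X @ np.linalg.inv(X.T @ X) @ X.T       # hat matrix: symmetric and idempotent
I = np.eye(n)
M = (I - H).T @ (I - H)                    # matrix of the quadratic form (= I - H here)

k = np.trace(M)
print(k, n - p)                            # degrees of freedom: tr(M) = n - p

# Simulate RSS / sigma^2; since H y estimates mu = X beta without bias,
# the noncentrality is zero and the mean should be close to k.
reps = 2000
rss = np.empty(reps)
for r in range(reps):
    y = X @ beta + sigma * rng.standard_normal(n)
    rss[r] = y @ M @ y                     # RSS as a quadratic form in y
print((rss / sigma**2).mean(), k)
```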
