FWL theorem
In econometrics, the Frisch–Waugh–Lovell (FWL) theorem is named after the econometricians Ragnar Frisch, Frederick V. Waugh, and Michael C. Lovell.
The Frisch–Waugh–Lovell theorem states that if the regression we are concerned with is

    y = X_1 \beta_1 + X_2 \beta_2 + u,

where X_1 and X_2 are n \times k_1 and n \times k_2 matrices respectively and where \beta_1 and \beta_2 are conformable, then the estimate of \beta_2 will be the same as the estimate of it from a modified regression of the form

    M_{X_1} y = M_{X_1} X_2 \beta_2 + M_{X_1} u,

where M_{X_1} projects onto the orthogonal complement of the image of the projection matrix X_1 (X_1^\top X_1)^{-1} X_1^\top. Equivalently, M_{X_1} projects onto the orthogonal complement of the column space of X_1. Specifically,

    M_{X_1} = I - X_1 (X_1^\top X_1)^{-1} X_1^\top.
This result implies that the secondary regressions are unnecessary: using projection matrices to make the variables orthogonal to each other leads to the same estimates as running the regression with all of the non-orthogonal regressors included.
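The theorem can be checked numerically. The following sketch (using NumPy; the variable names and simulated data are illustrative, not from the source) compares the estimate of \beta_2 from the full regression with the estimate obtained by residualizing both y and X_2 on X_1 and regressing the residuals:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
X1 = np.column_stack([np.ones(n), rng.normal(size=n)])  # n x k1 (intercept + one regressor)
X2 = rng.normal(size=(n, 2))                            # n x k2
y = X1 @ [1.0, 2.0] + X2 @ [3.0, -1.0] + rng.normal(size=n)

# Full regression: least squares on [X1 X2], keep the coefficients on X2
X = np.hstack([X1, X2])
beta_full = np.linalg.lstsq(X, y, rcond=None)[0]
beta2_full = beta_full[X1.shape[1]:]

# FWL: annihilator matrix M_{X1} = I - X1 (X1'X1)^{-1} X1'
M1 = np.eye(n) - X1 @ np.linalg.inv(X1.T @ X1) @ X1.T

# Regress the residualized y on the residualized X2
beta2_fwl = np.linalg.lstsq(M1 @ X2, M1 @ y, rcond=None)[0]

print(np.allclose(beta2_full, beta2_fwl))  # the two estimates coincide
```

In practice one would residualize via two auxiliary least-squares fits rather than forming the n × n matrix M_{X_1} explicitly, but the explicit form mirrors the statement of the theorem.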