Frisch–Waugh–Lovell theorem
In econometrics, the Frisch–Waugh–Lovell[a] (FWL) theorem proves a property of ordinary least squares estimators. The theorem states that, in a least squares-estimated regression, each independent variable's coefficient reflects the relationship between the dependent variable and the part of that independent variable which is not linearly explained by the other covariates. By relating multiple regression coefficients to simple regression coefficients, the theorem forms the basis for interpreting coefficients in multiple regressions. The theorem is named for econometricians Ragnar Frisch, Frederick V. Waugh, and Michael C. Lovell.
Background
The Frisch-Waugh-Lovell theorem is a result for regressions estimated by ordinary least squares, the most commonly used estimator in applied econometrics.[1] Ordinary least squares is a method of estimating coefficients in regressions which are linear in parameters: a single dependent variable is modeled as a linear combination of one or more independent variables plus some error term. For example, wages may be modeled as a function of a constant term, education, and parental income, with an error term that encompasses deviations from the model's prediction. Least squares is one way of estimating the coefficients of such a model, setting the coefficients to minimize the sum of squared errors.[2] Under the hypotheses of the Gauss–Markov theorem, the least squares estimator is the best linear unbiased estimator.[3]
Let $y$ be any dependent variable and $x_1, \dots, x_k$ a set of independent variables, and suppose $n$ observations of each are obtained. If $y$ is modeled as a linear function of the independent variables and a constant, it can be written as $y = \beta_0 + \beta_1 x_1 + \cdots + \beta_k x_k + \varepsilon$. The least squares estimator sets the coefficients $\hat{\beta}_0, \dots, \hat{\beta}_k$ to minimize the sum of squared errors $\sum_i \hat{\varepsilon}_i^2$. With $n$ observations this involves minimizing across $n$ equations, and is typically written in matrix form as $y = X\beta + \varepsilon$, where $y$ and $\varepsilon$ are $n$-vectors of dependent variable observations and errors, respectively, and $X$ is an $n$-by-$(k+1)$ matrix of independent variables' observations. Then, the least squares solution is $\hat{\beta} = (X'X)^{-1}X'y$.[4]
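The matrix form of the least squares solution can be checked numerically. A minimal NumPy sketch, with simulated data and arbitrary illustrative coefficients:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Simulated data: y depends linearly on x1 and x2 plus noise
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)

# Design matrix: a constant column plus the independent variables
X = np.column_stack([np.ones(n), x1, x2])

# Closed-form least squares solution: (X'X)^{-1} X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Matches the estimate from a dedicated least squares solver
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Solving the normal equations directly is shown here only to mirror the formula; in practice a QR-based solver such as `lstsq` is numerically preferable.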
In regressions estimated by least squares, it is common to refer to an independent variable's coefficient as the effect of that variable "holding constant" the other independent variables.[5] For example, if wage is modeled as a function of education and work experience, the coefficient on education is interpreted as the difference in the expectation of wage for a unit difference in education, "holding constant" work experience. Econometrician Arthur Goldberger frames the Frisch-Waugh-Lovell theorem as "giving content to th[is] language".[6]
Definition and interpretation
The Frisch-Waugh-Lovell theorem states that in a least squares-estimated regression of the form
$$y = \beta_0 + \beta_1 x_1 + \cdots + \beta_k x_k + \varepsilon,$$
any coefficient $\hat{\beta}_j$ can be obtained by the two-step process of:
- Regress $x_j$ on the set of other independent variables, obtaining residuals $\tilde{x}_j$
- Regress $y$ on $\tilde{x}_j$, obtaining $\hat{\beta}_j$
This two-step process is referred to as the residual regression or equivalently the regression anatomy theorem.[7][8] This result is a numerical property of least squares estimation and does not depend on statistical properties of the data.[9][10]
From this theorem, each independent variable can be decomposed into two parts: the part which is linearly related to the set of other independent variables, and the residual which remains. Then, that independent variable's coefficient can be found from the simple regression of the dependent variable on those residuals.[11][12] This result is the basis for interpreting the impact of including additional variables in a regression: it is equivalent to removing from the existing variables the part which the new variables linearly explain.[13][14]
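Because the theorem is a numerical identity, it can be verified directly on simulated data. A sketch in NumPy (variable names and data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# x1 is correlated with x2, so simple and multiple regression coefficients differ
x2 = rng.normal(size=n)
x1 = 0.7 * x2 + rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)

# Full regression: y on a constant, x1, and x2
X = np.column_stack([np.ones(n), x1, x2])
beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)

# Step 1: regress x1 on the other regressors (constant and x2), keep residuals
Z = np.column_stack([np.ones(n), x2])
x1_tilde = x1 - Z @ np.linalg.lstsq(Z, x1, rcond=None)[0]

# Step 2: simple regression of y on the residuals
b1 = (x1_tilde @ y) / (x1_tilde @ x1_tilde)

# b1 equals the coefficient on x1 from the full regression
```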
The Frisch-Waugh-Lovell theorem can, for example, be applied to interpret multicollinearity. When most of the variation in an independent variable is explained by the other independent variables, very little variation remains after the first step of the residual regression. As a result, the estimate of that independent variable's coefficient may be less precise than if fewer variables were controlled for. This is because least squares, being equivalent to the regression of the dependent variable on the part of each independent variable not linearly related to the other independent variables, implicitly removes the variation in each independent variable explained by the others.[15][16]
Example
Consider the regression of wage on education and parental income:
$$\text{wage} = \beta_0 + \beta_1 \,\text{education} + \beta_2 \,\text{parental income} + \varepsilon$$
While the least squares estimates for $\beta_1$ and $\beta_2$ can be obtained by minimizing the sum of squared errors directly, each can be equivalently obtained by the two-step Frisch-Waugh-Lovell process. In the case of education:
- Regress education on parental income, saving the residuals from this regression: the part of education not linearly related to parental income
- Regress wages on the residuals, obtaining the least squares estimate for $\beta_1$
This illustrates how $\hat{\beta}_1$ reflects the effect of education on wages controlling for parental income: it is the relationship between wages and the part of education not linearly related to parental income.[8]
Double residual regression
The double residual regression is the three-step process:
- Regress $x_j$ on the set of other independent variables, obtaining residuals $\tilde{x}_j$
- Regress $y$ on the set of independent variables excluding $x_j$, obtaining residuals $\tilde{y}$
- Regress $\tilde{y}$ on $\tilde{x}_j$, estimating $\hat{\beta}_j$ and residuals $\hat{\varepsilon}$
Like the two-step process, this yields an identical coefficient to the full regression.[6][17] It includes the additional feature that the residuals from the regression in step 3 equal the residuals in the full regression.[11]
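The three steps above, and the equality of the step-3 residuals with those of the full regression, can likewise be verified numerically; a sketch with simulated data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400
x2 = rng.normal(size=n)
x1 = 0.5 * x2 + rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)

# Full regression and its residuals
X = np.column_stack([np.ones(n), x1, x2])
beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)
resid_full = y - X @ beta_full

# Steps 1 and 2: residualize both x1 and y on the remaining regressors
Z = np.column_stack([np.ones(n), x2])
x1_tilde = x1 - Z @ np.linalg.lstsq(Z, x1, rcond=None)[0]
y_tilde = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]

# Step 3: regress the y-residuals on the x1-residuals
b1 = (x1_tilde @ y_tilde) / (x1_tilde @ x1_tilde)
resid_dr = y_tilde - x1_tilde * b1

# Same coefficient as the full regression, and identical residuals
```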
Multivariate definition
Consider the least squares-estimated regression $y = X_1\beta_1 + X_2\beta_2 + \varepsilon$, where $y$ and $\varepsilon$ are $n$-vectors of dependent variable observations and errors, respectively, $X_1$ is an $n \times k_1$ matrix of independent variables' observations, $X_2$ is an $n \times k_2$ matrix of independent variables' observations, and $\beta_1$ and $\beta_2$ are coefficient vectors for $X_1$ and $X_2$ respectively. Denote the ordinary least squares estimate of $\beta_1$ as $\hat{\beta}_1$. Then, the Frisch-Waugh-Lovell theorem states that
$$\hat{\beta}_1 = (\tilde{X}_1'\tilde{X}_1)^{-1}\tilde{X}_1' y = (\tilde{X}_1'\tilde{X}_1)^{-1}\tilde{X}_1' \tilde{y}$$
where $\tilde{X}_1$ denotes the residuals from the regression of $X_1$ on $X_2$, and $\tilde{y}$ the residuals from the regression of $y$ on $X_2$. The first expression of $\hat{\beta}_1$ is the residual regression estimate, while the second is the double residual regression estimate.[6]
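Both matrix expressions can be confirmed on simulated data with multi-column blocks of regressors; a NumPy sketch (dimensions and coefficients chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300

# X2 includes a constant column; X1 has two columns correlated with X2
X2 = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
X1 = X2 @ rng.normal(size=(3, 2)) + rng.normal(size=(n, 2))
y = X1 @ np.array([1.5, -2.0]) + X2 @ np.array([0.3, 0.7, -1.1]) + rng.normal(size=n)

# Full regression: the coefficients on X1 are the first two entries
beta_full, *_ = np.linalg.lstsq(np.column_stack([X1, X2]), y, rcond=None)

# Residualize X1 and y on X2
X1_tilde = X1 - X2 @ np.linalg.lstsq(X2, X1, rcond=None)[0]
y_tilde = y - X2 @ np.linalg.lstsq(X2, y, rcond=None)[0]

# Residual regression estimate (y on the X1 residuals) ...
b1_resid = np.linalg.solve(X1_tilde.T @ X1_tilde, X1_tilde.T @ y)
# ... and double residual regression estimate (the y residuals instead)
b1_double = np.linalg.solve(X1_tilde.T @ X1_tilde, X1_tilde.T @ y_tilde)
```

The two estimates coincide because the residual-maker of the second regression is symmetric and idempotent, so residualizing $y$ as well changes nothing.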
Geometric interpretation
With a linear regression of the form $y = X\beta + \varepsilon$, the fitted values $\hat{y} = X\hat{\beta}$ can be interpreted as the orthogonal projection of $y$ onto the column space of $X$, $\operatorname{col}(X)$.[18] The Frisch-Waugh-Lovell theorem is then (in the double residual regression case) the three-step process:
- Project $y$ onto the orthogonal complement of $\operatorname{col}(X_2)$, obtaining the residual vector $\tilde{y}$
- Project the columns of $X_1$ onto the orthogonal complement of $\operatorname{col}(X_2)$, obtaining the residual matrix $\tilde{X}_1$
- Project $\tilde{y}$ onto $\operatorname{col}(\tilde{X}_1)$, obtaining the projection $\tilde{X}_1\hat{\beta}_1$ and residuals $\hat{\varepsilon}$
The resulting $\hat{\beta}_1$ and residuals $\hat{\varepsilon}$ are identical to those in the full regression of $y$ on $X_1$ and $X_2$.[19][20]
Proof
Consider the linear regression $y = X_1\beta_1 + X_2\beta_2 + \varepsilon$ and the annihilator matrix $M_2 = I - X_2(X_2'X_2)^{-1}X_2'$. Premultiplying both sides of the regression equation by the annihilator matrix removes from $y$ and $X_1$ the part linearly explained by $X_2$:
$$M_2 y = M_2 X_1 \beta_1 + M_2 \varepsilon,$$
since $M_2 X_2 = 0$. Then, by the least squares result, $\hat{\beta}_1 = (\tilde{X}_1'\tilde{X}_1)^{-1}\tilde{X}_1'\tilde{y}$ and $\hat{\varepsilon} = \tilde{y} - \tilde{X}_1\hat{\beta}_1$, where $\tilde{X}_1 = M_2 X_1$ and $\tilde{y} = M_2 y$. This concludes the proof.[6][21]
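The algebra of the proof can be checked directly: a NumPy sketch constructing the annihilator matrix and confirming its properties (data simulated for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100
X2 = np.column_stack([np.ones(n), rng.normal(size=n)])
X1 = (0.6 * X2[:, 1] + rng.normal(size=n)).reshape(-1, 1)
y = 2.0 * X1[:, 0] + X2 @ np.array([1.0, -0.5]) + rng.normal(size=n)

# Annihilator matrix of X2: M2 = I - X2 (X2'X2)^{-1} X2'
M2 = np.eye(n) - X2 @ np.linalg.solve(X2.T @ X2, X2.T)

# M2 is symmetric, idempotent, and annihilates X2
sym_ok = np.allclose(M2, M2.T)
idem_ok = np.allclose(M2 @ M2, M2)
annihilates = np.allclose(M2 @ X2, 0.0)

# Least squares on the premultiplied equation recovers the coefficient
# on X1 from the full regression
beta_full, *_ = np.linalg.lstsq(np.column_stack([X1, X2]), y, rcond=None)
b1, *_ = np.linalg.lstsq(M2 @ X1, M2 @ y, rcond=None)
```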
History
Yule (1907)
In 1907, statistician Udny Yule introduced a new system of notation for, and derived a number of algebraic results of, least squares-estimated regression coefficients. Among his results was an early form of the Frisch-Waugh-Lovell theorem.[22][23] In Yule's notation, where $x_{1\cdot 3\dots n}$ represents the residuals from the regression of $x_1$ on $x_3$ through $x_n$, and $x_{2\cdot 3\dots n}$ the residuals from the regression of $x_2$ on $x_3$ through $x_n$, he finds that the regression of $x_{1\cdot 3\dots n}$ on $x_{2\cdot 3\dots n}$ yields the coefficient for $x_2$ in the full regression of $x_1$ on $x_2$ through $x_n$. He notes that this relationship holds "quite generally and without reference to the form of the [variables'] frequency distribution."[24] With this result, Yule defines the multiple regression coefficient $b_{12\cdot 3\dots n}$ – the coefficient on $x_2$ in the regression of $x_1$ on $x_2$ through $x_n$ – as the simple regression of $x_{1\cdot 3\dots n}$ on $x_{2\cdot 3\dots n}$:
$$b_{12\cdot 3\dots n} = \frac{\sum x_{1\cdot 3\dots n}\, x_{2\cdot 3\dots n}}{\sum x_{2\cdot 3\dots n}^2}$$
Having related simple regression coefficients to multiple regression coefficients, Yule describes his result as filling a gap in the interpretation of least squares coefficients and partial correlations by showing that they reflect "an actual correlation between determinate variables."[24][25]
Using Yule's notation, in a 1968 text econometrician Arthur Goldberger states both the residual regression and the double residual regression forms of the Frisch-Waugh-Lovell theorem.[26]
Frisch and Waugh (1933)
In the early 20th century, there was debate among economists over the correct approach to adjusting time series data used in regressions for the influence of trends. The two primary methods in question were the direct de-trending of each time series and the inclusion of a time trend in the regression. In 1933 and using the notation introduced by Yule, a paper in the first volume of Econometrica by econometricians Ragnar Frisch and Frederick V. Waugh proved the equivalence between the two methods.[27][28][29]
Prior to Frisch and Waugh's result, much of the debate around the optimal time trend adjustment concerned estimates of static demand equations whose observations had been taken over time, a factor which economists sought to adjust for in order to bring the statistical model closer in line with theory. Advocates of including time trends in regressions argued that doing so improved the model's fit, while opponents argued that time trends may violate ceteris paribus assumptions of the underlying theoretical model.[30] In proving the equivalence of the two methods, addressing the difference in model fit, and formalizing the distinction between estimated coefficients and theoretical models, Frisch and Waugh's paper resolved the debate around trend adjustments.[31]
Economist and historian Mary S. Morgan, placing Frisch and Waugh's result in the context of a broader understanding of regression coefficients, describes it as having "paved the way for a more generous use of the other factors in the demand equation."[32] Frisch and Waugh's results were, in 1952, extended by Gerhard Tintner to polynomial trend adjustment.[33][34] In a 1953 textbook on demand analysis, econometrician Herman Wold references Frisch and Waugh's paper as a special case applied to time adjustments.[35]
Generalization and later development
In 1963, econometrician Michael C. Lovell extended Frisch and Waugh's results and provided a general proof of the theorem in matrix notation.[22][19][36] Rather than focusing on certain types of variables, as Frisch and Waugh did with time trends, Lovell proves the result for arbitrary sets of independent variables.[37] Lovell presents seven regression specifications and proves how their coefficients relate, among them both the residual and double residual forms of the theorem.[38] Lovell published an additional proof in 2008 using only simple algebra.[37]
In 1964, economist Richard Stone published a generalized proof of the theorem.[39]
The Frisch-Waugh-Lovell theorem is included in most intermediate to advanced econometrics textbooks.[19]
Naming
The theorem has been referred to under a number of names, including the Frisch-Waugh-Lovell theorem, Frisch-Waugh theorem, partitioned regression theorem, residual regression, and the regression anatomy theorem.[19][15]
While Frisch and Waugh's paper was not the first introduction of the result, it was the first proof in econometrics.[25][19] Recognizing their proof and the generalization by Lovell, the theorem was presented as the Frisch-Waugh-Lovell theorem in a 1993 econometrics textbook by Russell Davidson and James G. MacKinnon.[19]
Extensions
While the Frisch-Waugh-Lovell theorem states that the full and residual regressions have the same coefficients, relationships between the coefficients' standard errors can also be shown. Lovell's 1963 paper finds that the homoskedastic standard errors of coefficients in the double residual regression differ from those of the full regression by a degrees of freedom adjustment.[40] In 2021, statistician Peng Ding presented a proof of Lovell's results and found comparable results for other estimators of standard errors, including heteroskedasticity-consistent and clustered standard errors.[41]
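The degrees of freedom adjustment can be illustrated numerically; a sketch under the classical homoskedastic variance formula (the assumption here is the textbook estimator $s^2(X'X)^{-1}$, with $s^2$ the residual sum of squares divided by the residual degrees of freedom). Since the two regressions share the same residuals but divide by different degrees of freedom, their naive standard errors differ only by a fixed factor:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200
x2 = rng.normal(size=n)
x1 = 0.5 * x2 + rng.normal(size=n)
y = 1.0 + 2.0 * x1 - x2 + rng.normal(size=n)

# Full regression: k = 3 regressors (constant, x1, x2)
X = np.column_stack([np.ones(n), x1, x2])
k = X.shape[1]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
rss = np.sum((y - X @ beta) ** 2)
se_full = np.sqrt(rss / (n - k) * np.linalg.inv(X.T @ X)[1, 1])

# Double residual regression: a single regressor (the x1 residuals)
Z = np.column_stack([np.ones(n), x2])
x1t = x1 - Z @ np.linalg.lstsq(Z, x1, rcond=None)[0]
yt = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
b1 = (x1t @ yt) / (x1t @ x1t)
rss_dr = np.sum((yt - x1t * b1) ** 2)   # identical to rss by the theorem
se_dr = np.sqrt(rss_dr / (n - 1) / (x1t @ x1t))

# Same coefficient; naive standard errors differ by the DoF factor
dof_factor = np.sqrt((n - 1) / (n - k))
```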
Analogues to the Frisch-Waugh-Lovell theorem have been shown for a number of other estimators, including generalized least squares,[42] ridge regression and the LASSO,[43] and k-class estimators, including limited information maximum likelihood.[44]
Notes
- ^ Pronounced /ˈfriʃˌwɔːˌlʌvəl/.
References
Sources
- ^ Hansen 2022, p. 13.
- ^ Hansen 2022, p. 65.
- ^ Hansen 2022, p. 105.
- ^ Hansen 2022, p. 73.
- ^ Hansen 2022, p. 30.
- ^ a b c d Goldberger 1991, p. 186.
- ^ Hansen 2022, p. 81.
- ^ a b Goldberger 1991, p. 185.
- ^ Yule 1907, p. 184.
- ^ Filoso 2013, p. 93.
- ^ a b Hansen 2022, p. 82.
- ^ Goldberger 1991, pp. 185–186.
- ^ Mosteller & Tukey 1977, p. 331.
- ^ Ruud 2000, p. 58.
- ^ a b Filoso 2013.
- ^ Goldberger 1991, pp. 245–246.
- ^ Goldberger 1968, p. 30.
- ^ Wold 1953, p. 189.
- ^ a b c d e f Sosa Escudero 2001.
- ^ Ruud 2000, pp. 57–64.
- ^ Lovell 1963, p. 1004.
- ^ a b Aldrich 1998, p. 78.
- ^ Agresti 2015, p. 60.
- ^ a b Yule 1907, pp. 183–184.
- ^ a b Aldrich 1998, pp. 69–70.
- ^ Goldberger 1968, pp. 35–37.
- ^ Aldrich 1998, p. 67.
- ^ Morgan 1996, pp. 149–151.
- ^ Frisch & Waugh 1933.
- ^ Morgan 1996, pp. 142–150.
- ^ Morgan 1996, pp. 150–152.
- ^ Morgan 1996, p. 151.
- ^ Tintner 1952, pp. 304–306.
- ^ Lovell 1963, p. 1000.
- ^ Wold 1953, p. 245.
- ^ Lovell 1963.
- ^ a b Lovell 2008.
- ^ Lovell 1963, p. 1001.
- ^ Stone 1970, pp. 73–74.
- ^ Lovell 1963, pp. 1002–1003.
- ^ Ding 2021.
- ^ Fiebig & Bartels 1996.
- ^ Yamada 2017.
- ^ Basu 2024.
Journal articles
- Yule, George Udny (1907-05-14). "On the theory of correlation for any number of variables, treated by a new system of notation". Proceedings of the Royal Society of London. Series A, Containing Papers of a Mathematical and Physical Character. 79 (529): 182–193. doi:10.1098/rspa.1907.0028. ISSN 0950-1207.
- Frisch, Ragnar; Waugh, Frederick V. (1933). "Partial Time Regressions as Compared with Individual Trends". Econometrica. 1 (4): 387–401. doi:10.2307/1907330. ISSN 0012-9682.
- Lovell, Michael C. (December 1963). "Seasonal Adjustment of Economic Time Series and Multiple Regression Analysis". Journal of the American Statistical Association. 58 (304): 993–1010. doi:10.1080/01621459.1963.10480682. ISSN 0162-1459.
- Lovell, Michael C. (January 2008). "A Simple Proof of the FWL Theorem". The Journal of Economic Education. 39 (1): 88–91. doi:10.3200/JECE.39.1.88-91. ISSN 0022-0485.
- Aldrich, John (1998). "Doing Least Squares: Perspectives from Gauss and Yule". International Statistical Review / Revue Internationale de Statistique. 66 (1): 61–81. doi:10.2307/1403657. ISSN 0306-7734.
- Sosa Escudero, Walter (March 2001). "A geometric representation of the Frisch-Waugh-Lovell theorem". Documentos de Trabajo. 29. ISSN 1853-3930.
- Filoso, Valerio (March 2013). "Regression Anatomy, Revealed". The Stata Journal: Promoting communications on statistics and Stata. 13 (1): 92–106. doi:10.1177/1536867X1301300107. ISSN 1536-867X.
- Ding, Peng (2021-01-01). "The Frisch–Waugh–Lovell theorem for standard errors". Statistics & Probability Letters. 168 108945. doi:10.1016/j.spl.2020.108945. ISSN 0167-7152.
- Basu, Deepankar (2024-10-01). "Frisch–Waugh–Lovell theorem-type results for the k-Class and 2SGMM estimators". Statistics & Probability Letters. 213 110188. doi:10.1016/j.spl.2024.110188. ISSN 0167-7152.
- Fiebig, Denzil G.; Bartels, Robert (January 1996). "The Frisch-Waugh theorem and generalized least squares". Econometric Reviews. 15 (4): 431–443. doi:10.1080/07474939608800365. ISSN 0747-4938.
- Yamada, Hiroshi (2017-11-02). "The Frisch–Waugh–Lovell theorem for the lasso and the ridge regression". Communications in Statistics - Theory and Methods. 46 (21): 10897–10902. doi:10.1080/03610926.2016.1252403. ISSN 0361-0926.
Books
- Agresti, Alan (2015). Foundations of linear and generalized linear models. Hoboken, New Jersey: John Wiley & Sons Inc. ISBN 978-1-118-73005-8.
- Goldberger, Arthur Stanley (1991). A Course in Econometrics. Cambridge, Mass.: Harvard Univ. Press. ISBN 978-0-674-17544-0.
- Goldberger, Arthur Stanley (1968). Topics in Regression Analysis. New York: MacMillan. LCCN 68-15265.
- Hansen, Bruce E. (2022). Econometrics. Princeton: Princeton University Press. ISBN 978-0-691-23615-5.
- Morgan, Mary S. (1996). The History of Econometric Ideas. Historical Perspectives on Modern Economics (Reprinted ed.). Cambridge: Cambridge Univ. Press. ISBN 978-0-521-37398-2.
- Mosteller, Frederick; Tukey, John W. (1977). Data Analysis and Regression a Second Course in Statistics. Addison-Wesley. ISBN 0-201-04854-X.
- Stone, Richard (1970). "A generalization of the theorem of Frisch and Waugh". Mathematical Models of the Economy and Other Essays. Chapman and Hall. pp. 73–74. ISBN 0-412-10030-4.
- Ruud, Paul Arthur (2000). An Introduction to Classical Econometric Theory. New York: Oxford University Press. ISBN 978-0-19-511164-4.
- Tintner, Gerhard (1952). Econometrics. New York: John Wiley & Sons. LCCN 51-13006.
- Wold, Herman (1953). Demand Analysis: A study in econometrics. John Wiley & Sons.