In general, I don’t think students should draw a sharp line between simple regression and multiple regression, because the underlying idea of minimising the Sum of Squared Residuals (see Estimation I) and the procedure of taking the partial derivative with respect to each $\beta_j$ to estimate $\hat \beta_j$ are the same in both.

However, there is one fundamental theorem in multiple regression that every econometrics student should grasp, because it underpins many subsequent topics in econometrics. And that is the Frisch-Waugh-Lovell (FWL) Theorem.

(I don’t want to confuse you, but at the end I will add a note showing that simple regression is indeed a multiple regression. It is not a core concept to grasp, but it showcases that simple and multiple regression really are the same thing.)

Before we dive into the FWL Theorem, you should have a firm understanding of residuals, introduced in *Inference I*.


What the FWL theorem says

In a multiple regression model

$$ y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + ... + \beta_k x_k + \varepsilon $$

the coefficient $\hat \beta_1$ on $x_1$ is exactly the same as the one you would get if you:

  1. regress $y$ on $a_0 + a_2 x_2 + ... + a_k x_k$ and keep the residuals $\tilde y$ (these play the same role as $\hat \varepsilon$ in an ordinary OLS regression),
  2. regress $x_1$ on $b_0 + b_2 x_2 + ... + b_k x_k$ and keep the residuals $\tilde x$,
  3. regress $\tilde y$ on $\tilde x$.

The resulting $\hat \beta$ from step 3 is identical to the $\hat \beta_1$ from the full OLS regression of $y$ on $\beta_0 + \beta_1 x_1 + \beta_2 x_2 + ... + \beta_k x_k$, and its t-statistic matches too, once the error variance is computed with the full model's degrees of freedom (the sketches below verify both).
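To make this concrete, here is a minimal numerical sketch (my addition; NumPy only, with simulated data and illustrative variable names). It runs the full regression and then reproduces $\hat \beta_1$ via the three steps:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated data: x1 is correlated with the controls x2, x3
x2 = rng.normal(size=n)
x3 = rng.normal(size=n)
x1 = 0.5 * x2 - 0.3 * x3 + rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 1.0 * x2 + 0.5 * x3 + rng.normal(size=n)

ones = np.ones(n)

# Full OLS: y on [1, x1, x2, x3]
X_full = np.column_stack([ones, x1, x2, x3])
beta_full = np.linalg.lstsq(X_full, y, rcond=None)[0]

# FWL steps
W = np.column_stack([ones, x2, x3])                        # intercept + controls
y_tilde = y - W @ np.linalg.lstsq(W, y, rcond=None)[0]     # step 1: residualise y
x_tilde = x1 - W @ np.linalg.lstsq(W, x1, rcond=None)[0]   # step 2: residualise x1
beta_fwl = (x_tilde @ y_tilde) / (x_tilde @ x_tilde)       # step 3: simple regression

print(beta_full[1], beta_fwl)  # the two estimates agree up to floating-point error
```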

To put FWL in pure math symbols:

Model: $y = X \beta^{OLS} + W \gamma + \varepsilon$

Step 1: regress $y = W a + e_y$ and define $\tilde y := \hat e_y$

Step 2: regress $X = W b + e_x$ and define $\tilde x := \hat e_x$

Step 3: regress $\tilde y = \tilde x \beta^{FWL} + u$

FWL Claim: $\hat \beta^{FWL} = \hat \beta^{OLS}$

where $X$ is the regressor of interest ($x_1$ above) and $W$ collects the intercept and the remaining regressors $x_2, ..., x_k$.
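Why does the claim hold? A compact sketch (my addition; the annihilator matrix $M_W$ below is standard but not defined elsewhere in these notes): write $M_W := I - W(W'W)^{-1}W'$, so that $\tilde y = M_W y$ and $\tilde x = M_W X$. Because $M_W$ is symmetric and idempotent,

$$ \hat \beta^{FWL} = (\tilde x' \tilde x)^{-1} \tilde x' \tilde y = (X' M_W' M_W X)^{-1} X' M_W' M_W y = (X' M_W X)^{-1} X' M_W y $$

which is precisely the partitioned-regression (OLS) formula for the coefficient on $X$ in the full model.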


Direct implication of FWL

With FWL we can rewrite $\hat \beta_1$ as a simple regression coefficient on the residualised variables:

$$ \hat \beta^{FWL} = \frac{\sum(\tilde x_i - \bar{\tilde x})(\tilde y_i - \bar{\tilde y})}{\sum(\tilde x_i - \bar{\tilde x})^2} = \frac{\sum \tilde x_i \tilde y_i}{\sum \tilde x_i^2} = \hat \beta_1^{OLS} $$

where the middle equality uses the fact that OLS residuals from a regression with an intercept have mean zero, so $\bar{\tilde x} = \bar{\tilde y} = 0$.
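Continuing the NumPy sketch from earlier (reusing `y`, `X_full`, `y_tilde`, and `x_tilde` defined there, and assuming statsmodels is available), we can check both the zero-mean property and the t-statistic claim:

```python
import numpy as np
import statsmodels.api as sm

# OLS residuals from a regression with an intercept average to zero
print(np.isclose(y_tilde.mean(), 0), np.isclose(x_tilde.mean(), 0))

full = sm.OLS(y, X_full).fit()        # full multiple regression
fwl = sm.OLS(y_tilde, x_tilde).fit()  # step-3 simple regression (no intercept needed)

print(full.params[1], fwl.params[0])  # identical coefficients

# Step 3's naive t-stat divides the SSR by n - 1; rescale sigma^2 to the
# full model's degrees of freedom (n - k - 1) and, because the residuals of
# the two regressions are identical, the t-stats coincide exactly
n_obs, k1 = X_full.shape
s2 = fwl.ssr / (n_obs - k1)
t_adj = fwl.params[0] / np.sqrt(s2 / (x_tilde @ x_tilde))
print(full.tvalues[1], t_adj)         # identical t-statistics
```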