
I have an equation of the form (all vectors): $y=X_1\beta_1+X_2\beta_2+u$.

I'm interested in knowing whether the OLS estimators of the betas and the corresponding residuals for this equation are the same as when we apply OLS to the following equations:

  1. $P_{X_1}y=P_{X_1}X_2\beta_2+v$
  2. $P_Xy=X_1\beta_1+X_2\beta_2+v$,

where $P_Z$ denotes the usual projection matrix onto the column space of $Z$.

So, I've tried using the FWL theorem, and I got, respectively:

  1. $\hat\beta_2 = (X_2' P_{X_1}X_2)^{-1}X_2'P_{X_1}y$, and $\hat v = (I- P_{X_1}X_2(X_2'P_{X_1}X_2)^{-1}X_2'P_{X_1})P_{X_1}y$. I was wondering whether I miscalculated $\hat v$: looking at equation 1, since both $y$ and $X_2\beta_2$ are projected onto the space spanned by the columns of $X_1$, shouldn't the residuals be zero?
  2. $\hat\beta_2 = (X_2' M_{X_1}X_2)^{-1}X_2'M_{X_1}P_X y$, and $\hat v = (I- M_{X_1}X_2(X_2'M_{X_1}X_2)^{-1}X_2'M_{X_1})P_{X}y$. However, I do not see how the estimate of $\beta_2$ is the same in both cases, since applying OLS directly to equation 2 gives $\hat \beta=(X'X)^{-1}X'P_X y=(X'X)^{-1}X'y$.

Any help would be appreciated.

Edit 1: Well, I found out how to do the 2nd point. We have to notice that $M_{X_1}P_X=(I-P_{X_1})P_X=P_X-P_{X_1}=P_X'-P_{X_1}'=(M_{X_1}P_X)'=P_X'M_{X_1}'=P_X M_{X_1}$ and that $X_2'P_X=(P_X X_2)'=X_2'$. As to the 1st point, I have no idea...
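The two facts used in this edit are easy to check numerically. The sketch below (with simulated data, so all matrix names are illustrative) verifies that regressing $P_X y$ on $(X_1, X_2)$ reproduces the OLS coefficients from regressing $y$ itself, and that $M_{X_1}P_X = P_X - P_{X_1} = P_X M_{X_1}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k1, k2 = 50, 2, 3
X1 = rng.normal(size=(n, k1))
X2 = rng.normal(size=(n, k2))
X = np.hstack([X1, X2])
y = X @ rng.normal(size=k1 + k2) + rng.normal(size=n)

def proj(Z):
    # Orthogonal projection onto the column space of Z
    return Z @ np.linalg.solve(Z.T @ Z, Z.T)

P_X, P_X1 = proj(X), proj(X1)
M_X1 = np.eye(n) - P_X1  # annihilator of col(X1)

# OLS of y on X and OLS of P_X y on X give identical coefficients,
# because X'P_X = X'
beta_full = np.linalg.lstsq(X, y, rcond=None)[0]
beta_proj = np.linalg.lstsq(X, P_X @ y, rcond=None)[0]
assert np.allclose(beta_full, beta_proj)

# The identity from Edit 1: M_{X1} P_X = P_X - P_{X1} = P_X M_{X1}
assert np.allclose(M_X1 @ P_X, P_X - P_X1)
assert np.allclose(M_X1 @ P_X, P_X @ M_X1)
```

The key ingredient is that $\operatorname{col}(X_1) \subseteq \operatorname{col}(X)$, which gives $P_{X_1}P_X = P_{X_1}$.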

An old man in the sea.

2 Answers


What we know from the FWL theorem is that the regression

$$M_1y = M_1X_2\beta_2 + M_1u \tag{1}$$

will give the same estimates for $\beta_2$ as the full regression

$$y = X_1\beta_1 +X_2\beta_2 + u \tag{2}$$

where

$$M_1 = I - P_1 = I - X_1(X_1'X_1)^{-1}X_1'$$

is the so-called annihilator or residual-maker matrix. The estimator from $(1)$ is

$$\hat \beta_2 = (X_2'M_1X_2)^{-1}X_2'M_1y \tag{3}$$

So it boils down to examining whether the estimator from the specification

$$P_1y = P_1X_2\beta_2 + w \tag{4}$$

which is

$$\tilde \beta_2 = (X_2'P_1X_2)^{-1}X_2'P_1y \tag{5} $$

will be the same as $\hat \beta_2$.

Well,

$$(2),(3) \implies \hat \beta_2 - \beta_2 = (X_2'M_1X_2)^{-1}X_2'M_1u \tag{6}$$

while

$$ (2), (5) \implies \tilde \beta_2 -\beta_2 = (X_2'P_1X_2)^{-1}X_2'X_1\beta_1+ (X_2'P_1X_2)^{-1}X_2'P_1u \tag{7}$$

Given that $(6)$ and $(7)$ involve arbitrary exogenous quantities ($u$, $\beta_1$), I don't see how they could be equal, except by a zero-probability chance.

Even if in our sample $X_1$ and $X_2$ are orthogonal (which would eliminate the first term in $(7)$, but which, with observational data, is a joke to even mention), then both estimators would be unbiased under strict exogeneity, but this is as far as the similarities appear to go here.
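A quick simulation makes the point concrete. The sketch below (variable names and true coefficients are illustrative) confirms that the FWL estimator $(3)$ matches the $\beta_2$ block of the full regression $(2)$, while the estimator $(5)$ from the $P_1$-projected specification does not:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k1, k2 = 60, 2, 2
X1 = rng.normal(size=(n, k1))
X2 = rng.normal(size=(n, k2))
y = X1 @ np.array([1.0, -2.0]) + X2 @ np.array([0.5, 3.0]) + rng.normal(size=n)

P1 = X1 @ np.linalg.solve(X1.T @ X1, X1.T)  # projection onto col(X1)
M1 = np.eye(n) - P1                          # annihilator / residual-maker

# FWL estimator (3): identical to the beta_2 block of the full regression
beta_fwl = np.linalg.solve(X2.T @ M1 @ X2, X2.T @ M1 @ y)
beta_full = np.linalg.lstsq(np.hstack([X1, X2]), y, rcond=None)[0][k1:]
assert np.allclose(beta_fwl, beta_full)

# Estimator (5) from the P1-projected regression: generally different
beta_tilde = np.linalg.solve(X2.T @ P1 @ X2, X2.T @ P1 @ y)
assert not np.allclose(beta_tilde, beta_fwl)
```

Note that $X_2'P_1X_2$ is invertible here only because $k_2 \le k_1$; with more regressors in $X_2$ than in $X_1$, specification $(4)$ could not even be estimated.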

Alecos Papadopoulos
  • Alecos, thanks for your answer. I think you're just stating the FWL theorem. What I was trying to do was to use the FWL theorem and check whether I could draw some conclusions from its usage... – An old man in the sea. Feb 27 '16 at 16:30
  • In the first equation, those results are not from FWL; they're just the usual OLS. I did use FWL on the second equation and on the initial equation too, though. – An old man in the sea. Feb 27 '16 at 20:09
If $y = X_1\beta_1$, we can work out $\beta_1$ directly.
I believe that is the idea behind FWL, so it switches to using residuals:
the residuals of $X_1$ on $X_2$ and the residuals of $y$ on $X_2$ are both orthogonal to $X_2$.


Mou