
I want to show that $\hat{\beta}$ is identical whether I run the full multiple regression or use the partialling-out approach. To that end, consider the following partialling-out approach (without a constant!):

Step 1: $x_{ik} = x_{i2}'\alpha + v_{ik}$

Step 2: $y_i = \hat{v}_{ik}\,\beta + u_i$, where $\hat{v}_{ik}$ are the residuals from step 1
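
Before the algebra, here is a quick numerical sanity check of the claim (a minimal numpy sketch; the simulated data-generating process and all coefficient values are made up purely for illustration):

```python
# Sanity check: the coefficient on x_k from the full multiple regression
# should equal the coefficient from the two-step partialling-out approach.
# The data below are simulated; names and true coefficients are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
n = 500
X2 = rng.normal(size=(n, 3))                                # other regressors x_{i2} (no constant)
xk = X2 @ np.array([0.5, -1.0, 2.0]) + rng.normal(size=n)   # regressor of interest x_{ik}
y = 1.5 * xk + X2 @ np.array([1.0, 0.0, -0.5]) + rng.normal(size=n)

# Full regression of y on (x_k, X2); first entry is beta_hat
X = np.column_stack([xk, X2])
beta_full = np.linalg.lstsq(X, y, rcond=None)[0][0]

# Step 1: regress x_k on X2, keep the residuals v_hat
alpha_hat = np.linalg.lstsq(X2, xk, rcond=None)[0]
v_hat = xk - X2 @ alpha_hat

# Step 2: regress y on v_hat (no constant)
beta_partial = (y @ v_hat) / (v_hat @ v_hat)

print(beta_full, beta_partial)  # agree up to floating-point error
```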

My starting point is the following: $\hat{\beta} = \frac{\sum_i y_i \hat{v}_{ik}}{\sum_i \hat{v}_{ik}^2}$

We know from step 1 that $\hat{v}_{ik} = x_{ik} - x_{i2}'\hat{\alpha}$ and can plug that in.

We know as well that, with $x_{i2}$ a vector, $\hat{\alpha} = \left(\sum_j x_{j2} x_{j2}'\right)^{-1} \sum_j x_{j2}\, x_{jk}$ (in the scalar case this reduces to $\frac{\sum_j x_{jk} x_{j2}}{\sum_j x_{j2}^2}$).

This means that $\hat{v}_{ik} = x_{ik} - x_{i2}' \left(\sum_j x_{j2} x_{j2}'\right)^{-1} \sum_j x_{j2}\, x_{jk}$.

Plugging that in gives me:

$\hat{\beta} = \frac{\sum_i y_i \left( x_{ik} - x_{i2}' \left(\sum_j x_{j2} x_{j2}'\right)^{-1} \sum_j x_{j2}\, x_{jk}\right)}{\sum_i \left(x_{ik} - x_{i2}' \left(\sum_j x_{j2} x_{j2}'\right)^{-1} \sum_j x_{j2}\, x_{jk}\right)^2}$
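
One fact that might help carry this on (standard OLS algebra, so I state it without full proof): the step-1 normal equations make the residuals orthogonal to the regressors,

$$\sum_i x_{i2}\,\hat{v}_{ik} = 0 \quad\Longrightarrow\quad \sum_i \hat{v}_{ik}^2 = \sum_i \hat{v}_{ik}\left(x_{ik} - x_{i2}'\hat{\alpha}\right) = \sum_i \hat{v}_{ik}\,x_{ik},$$

so the denominator reduces to $\sum_i \hat{v}_{ik}\, x_{ik}$ without expanding the nested sums.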

But even with this, I am not sure how to carry on, or whether this will lead to a reasonable solution at all. As a side note: I adapted this approach from a proof of TSLS in the IV context, which is why it may look very similar, but I doubt that I adjusted it correctly to this situation with vectors. I hope for suggestions on how to carry on, but I am also open to other approaches. Only steps 1 and 2 are given; how you prove it is entirely up to you.

  • You can find a bunch of proofs at https://stats.stackexchange.com/questions/17336. Alternatively, just apply Gram-Schmidt orthogonalization. – whuber Oct 23 '23 at 12:06
