Hi, I am stuck on this one; the question is related to the Gauss–Markov theorem:
Consider a general alternative to the OLS estimator that is also a linear unbiased estimator, say ${\tilde \beta}$. Outline a proof that the OLS estimator $b$ is better, in a well-defined sense, than ${\tilde \beta}$.
For both estimators $b$ and ${\tilde \beta}$ I have to use the fact that $E(uu') = \sigma^2 I$.
I have shown already that $b = (X'X)^{-1}X'y = (X'X)^{-1}X'(X\beta + u)$
$= (X'X)^{-1}X'X\beta + (X'X)^{-1}X'u$
$= \beta + (X'X)^{-1}X'u.$
This establishes linearity. Next,
$E(b) = E[\beta + (X'X)^{-1}X'u] = \beta$, as $E[u] = 0$,
which establishes unbiasedness.
Variance–covariance matrix of $b$: $Var(b) = E[(b-\beta)(b-\beta)']$. Since $b - \beta = (X'X)^{-1}X'u$, and again using $E(uu') = \sigma^2 I$,
$Var(b) = E[(X'X)^{-1}X'uu'X(X'X)^{-1}] = (X'X)^{-1}X'\,E(uu')\,X(X'X)^{-1} = \sigma^2(X'X)^{-1}.$
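To convince myself the algebra above is right, I also ran a quick Monte Carlo check (NumPy; the design matrix, $\beta$, and $\sigma$ below are made up purely for illustration):

```python
# Quick Monte Carlo check of E(b) = beta and Var(b) = sigma^2 (X'X)^{-1}.
# The design matrix, beta, and sigma here are illustrative values only.
import numpy as np

rng = np.random.default_rng(0)
n, k, sigma, R = 50, 3, 2.0, 20000
X = rng.normal(size=(n, k))              # fixed (non-stochastic) regressors
beta = np.array([1.0, -2.0, 0.5])
XtX_inv = np.linalg.inv(X.T @ X)

U = sigma * rng.normal(size=(R, n))      # R draws of u with E(u)=0, E(uu')=sigma^2 I
Y = X @ beta + U                         # each row is one sample of y
B = Y @ X @ XtX_inv                      # each row is b' = y'X(X'X)^{-1}

print("mean of b :", B.mean(axis=0))                    # should be close to beta
emp_cov = np.cov(B, rowvar=False)                       # empirical Var(b)
theo_cov = sigma**2 * XtX_inv                           # sigma^2 (X'X)^{-1}
print("max gap   :", np.abs(emp_cov - theo_cov).max())  # should be small
```

With 20,000 replications the sample mean of $b$ matches $\beta$ and the empirical covariance matches $\sigma^2(X'X)^{-1}$ up to simulation noise.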
In order to show that $b$ is a better estimator than ${\tilde \beta}$, I need to follow the same reasoning, still using the fact that $E(uu') = \sigma^2 I$, and then conclude that $b$ has the minimum variance among all linear unbiased estimators, i.e. that $Var(\tilde \beta) - Var(b)$ is positive semidefinite, so that in this sense $Var(b) \leq Var(\tilde \beta)$, I guess.
I am not sure how to properly compare the two estimators, so if someone could give me a hint it would be much appreciated.
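My current guess at how the comparison should go (I am not sure this is the intended argument) is to write any linear estimator as $\tilde \beta = Cy$ for some matrix $C$. Since $E(\tilde \beta) = CX\beta$, unbiasedness for every $\beta$ forces $CX = I$. Setting $D = C - (X'X)^{-1}X'$, we get $DX = CX - I = 0$, and
$$Var(\tilde \beta) = E[Cuu'C'] = \sigma^2 CC' = \sigma^2\left[(X'X)^{-1} + DD'\right] = Var(b) + \sigma^2 DD',$$
because the cross terms involve $DX = 0$ and vanish. Since $DD'$ is positive semidefinite, $Var(\tilde \beta) - Var(b)$ is positive semidefinite, which would be the well-defined sense in which $b$ is best. Is this on the right track?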
self-study. See the tag wiki – Glen_b Feb 26 '15 at 20:55