
Let $y_a$ be an observed value and $\hat{y}_a$ the corresponding predicted value.

Then I've read that

$$\text{Var}(y_a- \hat{y}_a)=\sigma^2(1+ x_a'(X'X)^{-1}x_a)$$

but what's the proof for this?


Additional info: at the point $a$ the predictors take the values $X_1=x_{a1}, \dots, X_p = x_{ap}$,
$x_a' = (1, x_{a1}, ..., x_{ap})$
$x_a' \hat{\beta} = \hat{y}_a$
$y_a \sim N(x_a' \beta, \sigma^2)$
$E(y_a - \hat{y}_a)=0$
the $y_a$'s are iid.


1 Answer


It follows from bilinearity (linear in both arguments) of the covariance operator $\text{Cov}(\cdot,\cdot)$. You can prove everything just using this property, along with definitions.

Recall that $\hat{y}_a = x_a'\hat{\beta} = x_a'(X'X)^{-1}X'y.$ First, the variance of $\hat{y}_a$:

\begin{align*} \text{Var}(\hat{y}_a) &= \text{Cov}(\hat{y}_a, \hat{y}_a) \\ &= \text{Cov}(x_a'(X'X)^{-1}X'y, x_a'(X'X)^{-1}X'y) \\ &= x_a'(X'X)^{-1}X'\left[ \sigma^2 I \right] X (X'X)^{-1}x_a \\ &= \sigma^2 x_a'(X'X)^{-1}x_a \end{align*}
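If a numerical sanity check of this step helps, here is a minimal simulation sketch (assuming NumPy; the design matrix, $\beta$, $\sigma$, and $x_a$ below are made-up values, not from the question). It refits $\hat{\beta}$ on many simulated training samples and compares the empirical variance of $\hat{y}_a$ with $\sigma^2 x_a'(X'X)^{-1}x_a$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up design: n = 50 observations, intercept plus 2 predictors.
n, sigma = 50, 1.5
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta = np.array([1.0, 2.0, -0.5])
x_a = np.array([1.0, 0.3, -1.2])  # new point, leading 1 for the intercept

reps = 100_000
y_hat_a = np.empty(reps)
for i in range(reps):
    y = X @ beta + sigma * rng.normal(size=n)      # simulated training responses
    beta_hat = np.linalg.solve(X.T @ X, X.T @ y)   # OLS estimate
    y_hat_a[i] = x_a @ beta_hat                    # fitted value at x_a

theory = sigma**2 * (x_a @ np.linalg.solve(X.T @ X, x_a))
print(y_hat_a.var(), theory)  # the two numbers should nearly agree
```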

Then, since the new observation $y_a$ is independent of the training responses $y$, we have $\text{Cov}(y_a, y) = 0$ and so \begin{align*} \text{Cov}(y_a, \hat{y}_a) &= \text{Cov}(y_a, x_a'(X'X)^{-1}X'y) \\ &= \text{Cov}(y_a, y)X(X'X)^{-1}x_a \\ &= 0 \cdot X(X'X)^{-1}x_a \\ &= 0. \end{align*}

Putting it all together: \begin{align*} \text{Var}(y_a- \hat{y}_a) &= \text{Cov}(y_a- \hat{y}_a, y_a- \hat{y}_a) \\ &= \text{Var}(y_a) - 2\text{Cov}(y_a,\hat{y}_a) + \text{Var}(\hat{y}_a) \\ &= \sigma^2 - 2\times 0 + \sigma^2 x_a'(X'X)^{-1}x_a \\ &= \sigma^2(1+ x_a'(X'X)^{-1}x_a) \end{align*}
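And a Monte Carlo sketch of the full result, in the same spirit as the check above (again NumPy with arbitrary made-up values): each replication draws fresh training responses and an independent new observation $y_a$, so the empirical variance of the prediction error $y_a - \hat{y}_a$ should match $\sigma^2(1 + x_a'(X'X)^{-1}x_a)$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Same kind of made-up setup: fixed design, true beta, noise level sigma.
n, sigma = 50, 1.5
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta = np.array([1.0, 2.0, -0.5])
x_a = np.array([1.0, 0.3, -1.2])

reps = 100_000
err = np.empty(reps)
for i in range(reps):
    y = X @ beta + sigma * rng.normal(size=n)      # training responses
    y_a = x_a @ beta + sigma * rng.normal()        # independent new draw
    beta_hat = np.linalg.solve(X.T @ X, X.T @ y)   # OLS estimate
    err[i] = y_a - x_a @ beta_hat                  # prediction error

theory = sigma**2 * (1 + x_a @ np.linalg.solve(X.T @ X, x_a))
print(err.var(), theory)  # the two numbers should nearly agree
```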
