It follows from bilinearity of the covariance operator $\text{Cov}(\cdot,\cdot)$, that is, linearity in each argument separately. You can prove everything just using this property, along with the definitions.
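In matrix form, bilinearity gives the rule used at every step below: for constant matrices $A$, $B$ and random vectors $u$, $v$ of conformable dimensions,
\begin{align*}
\text{Cov}(Au, Bv) = A\,\text{Cov}(u,v)\,B'.
\end{align*}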
Recall that $\hat{y}_a = x_a'\hat{\beta} = x_a'(X'X)^{-1}X'y$. First,
\begin{align*}
\text{Var}(\hat{y}_a) &= \text{Cov}(\hat{y}_a, \hat{y}_a) \\
&= \text{Cov}(x_a'(X'X)^{-1}X'y,\; x_a'(X'X)^{-1}X'y) \\
&= x_a'(X'X)^{-1}X'\,\text{Cov}(y,y)\,X(X'X)^{-1}x_a \\
&= x_a'(X'X)^{-1}X'\left[ \sigma^2 I \right] X (X'X)^{-1}x_a \\
&= \sigma^2 x_a'(X'X)^{-1}x_a
\end{align*}
Then
\begin{align*}
\text{Cov}(y_a, \hat{y}_a) &= \text{Cov}(y_a,\; x_a'(X'X)^{-1}X'y) \\
&= \text{Cov}(y_a, y)\,X(X'X)^{-1}x_a \\
&= 0 \cdot X(X'X)^{-1}x_a \\
&= 0,
\end{align*}
since the new observation $y_a$ is independent of the training responses $y$, so $\text{Cov}(y_a, y)$ is a row vector of zeros.
Putting it all together:
\begin{align*}
\text{Var}(y_a - \hat{y}_a) &= \text{Cov}(y_a - \hat{y}_a,\; y_a - \hat{y}_a) \\
&= \text{Var}(y_a) - 2\,\text{Cov}(y_a, \hat{y}_a) + \text{Var}(\hat{y}_a) \\
&= \sigma^2 - 2 \cdot 0 + \sigma^2 x_a'(X'X)^{-1}x_a \\
&= \sigma^2\left(1 + x_a'(X'X)^{-1}x_a\right)
\end{align*}
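If you want a quick sanity check of the result, here is a short Monte Carlo sketch in Python/NumPy (not part of the derivation; all the numerical values below are arbitrary choices). It holds $X$ and $x_a$ fixed, redraws the noise and the new observation, and compares the empirical variance of $y_a - \hat{y}_a$ against $\sigma^2(1 + x_a'(X'X)^{-1}x_a)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: n observations, p predictors, noise sd sigma
n, p, sigma = 50, 3, 2.0
X = rng.normal(size=(n, p))          # fixed design matrix
beta = np.array([1.0, -2.0, 0.5])    # true coefficients (arbitrary)
x_a = rng.normal(size=p)             # fixed new design point

XtX_inv = np.linalg.inv(X.T @ X)
theory = sigma**2 * (1 + x_a @ XtX_inv @ x_a)  # sigma^2 (1 + x_a'(X'X)^{-1} x_a)

# Monte Carlo: redraw the noise and the new observation, holding X and x_a fixed
reps = 100_000
Y = X @ beta + sigma * rng.normal(size=(reps, n))  # (reps, n) training responses
beta_hat = Y @ X @ XtX_inv                         # (reps, p) OLS estimates
y_a = x_a @ beta + sigma * rng.normal(size=reps)   # independent new observations
errors = y_a - beta_hat @ x_a                      # prediction errors

print(f"theory: {theory:.4f}  simulated: {errors.var():.4f}")
```

With this many replications the two printed numbers should agree to roughly two decimal places.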