
Let $b$ denote the OLS estimator. Assuming normality of the errors, we get $b \sim \mathop{\mathcal N}\left(\beta, \sigma^2\left(X'X\right)^{-1}\right)$, where $X\in\mathbb R^{N\times K}$ is of full column rank $K$. Now, for $R\in\mathbb R^{J\times K}$ with $J\leq K$ it must be that$$Rb\sim \mathop{\mathcal N}\left(R\beta,\sigma^2R\left(X'X\right)^{-1}R'\right),$$which then implies $\left(Rb-R\beta\right)'\left[\sigma^2 R\left(X'X\right)^{-1}R'\right]^{-1}\left(Rb-R\beta\right)\sim\chi^2_J$.
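A quick Monte Carlo sketch in Python (NumPy/SciPy; the dimensions, seed, and number of replications below are arbitrary choices, not part of the derivation) illustrates the claim; note that it already inverts the $J\times J$ matrix, which is exactly what the question below is about:

```python
# Monte Carlo sketch: with normal errors, the quadratic form
# (Rb - Rbeta)' [sigma^2 R (X'X)^{-1} R']^{-1} (Rb - Rbeta) should be chi^2_J.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
N, K, J, sigma = 200, 4, 2, 1.5

X = rng.standard_normal((N, K))           # full column rank K (a.s.)
beta = rng.standard_normal(K)
R = rng.standard_normal((J, K))           # full row rank J (a.s.)

XtX_inv = np.linalg.inv(X.T @ X)
V_inv = np.linalg.inv(sigma**2 * R @ XtX_inv @ R.T)

draws = []
for _ in range(5000):
    y = X @ beta + sigma * rng.standard_normal(N)
    b = XtX_inv @ X.T @ y                 # OLS estimate
    d = R @ (b - beta)
    draws.append(d @ V_inv @ d)

# empirical quantiles of the statistic vs. the chi^2_J quantiles
print(np.quantile(draws, [0.5, 0.9]))
print(stats.chi2.ppf([0.5, 0.9], df=J))   # should be close
```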

My question is: how do we know that $R\left(X'X\right)^{-1}R'$ is invertible? $R$ is assumed to have full row rank $J$, and $X'X$ is a symmetric invertible matrix, but I don't see how this implies that the $J\times J$ product has full rank and is thus invertible. I have seen this derivation in several sources, including on SE, and none of them address this point.
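For concreteness, a minimal numerical check of the matrix in question (again with arbitrary dimensions of my own choosing) suggests it is positive definite, hence invertible:

```python
# Minimal check: for full-column-rank X and full-row-rank R,
# the J x J matrix R (X'X)^{-1} R' comes out positive definite.
import numpy as np

rng = np.random.default_rng(0)
N, K, J = 100, 5, 3

X = rng.standard_normal((N, K))          # full column rank K (a.s.)
R = rng.standard_normal((J, K))          # full row rank J (a.s.)

M = R @ np.linalg.inv(X.T @ X) @ R.T     # the J x J matrix in question

print(np.linalg.matrix_rank(M))          # J  -> invertible
print(np.all(np.linalg.eigvalsh(M) > 0)) # True -> positive definite
```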


2 Answers


Lemma $1.$ If $\mathbf A$ is an $n\times n$ positive definite (p.d.) matrix and $\mathbf C$ is a $p\times n$ matrix of rank $p,$ then $\mathbf C\mathbf A\mathbf C^\top$ is p.d.

The proof is straightforward: $\mathbf x^\top\mathbf C\mathbf A\mathbf C^\top\mathbf x=\left(\mathbf C^\top\mathbf x\right)^\top\mathbf A\left(\mathbf C^\top\mathbf x\right)\geq 0$ since $\mathbf A$ is p.d., and

$$\mathbf x^\top\mathbf C\mathbf A\mathbf C^\top\mathbf x= 0\iff\mathbf C^\top\mathbf x =\mathbf 0 \iff \mathbf x= \mathbf 0,\tag 1$$

where the first equivalence holds because $\mathbf A$ is p.d. and the second because $\mathbf C^\top$ has full column rank $p,$ so its null space is trivial.

$\blacksquare$
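As a quick numerical sanity check of the lemma (the construction of $\mathbf A$ and the dimensions below are illustrative choices only):

```python
# Build a random p.d. A and a full-row-rank C, and confirm C A C' is p.d.
import numpy as np

rng = np.random.default_rng(1)
n, p = 6, 4

B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)              # p.d. by construction
C = rng.standard_normal((p, n))          # rank p almost surely

CAC = C @ A @ C.T
print(np.all(np.linalg.eigvalsh(CAC) > 0))   # True: C A C' is p.d.
```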

Observation $1.$ A p.d. matrix is nonsingular, since all of its eigenvalues are positive and hence its determinant is nonzero.


Reference:

[I] George A. F. Seber and Alan J. Lee, Linear Regression Analysis, John Wiley & Sons, 2003, sec. A.4, p. 461.

User1865345
  • 8,202
  • (+1) I don't know how I managed to miss the obvious. – statmerkur Nov 28 '22 at 09:01
  • Thank you, I didn't think of connecting it to positive definiteness. I had a look at the reference, and we can even use the fact that if $A$ is p.d. then the rank of $CAC'$ equals the rank of $C$; in this case the product has full rank $J$. But these results are all intertwined. – Zugzwang14 Nov 28 '22 at 13:17

We can write the real positive definite matrix $\left(X^\top X\right)^{-1}$ via its eigendecomposition $\left(X^\top X\right)^{-1} = Q\Lambda Q^\top$, where $\Lambda$ is diagonal with strictly positive entries. With $\Lambda = \Lambda^{1/2}\left(\Lambda^{1/2}\right)^\top$ and $Y\mathrel{:=}\left(\Lambda^{1/2}\right)^\top Q^\top R^\top$ we have $R\left(X^\top X\right)^{-1}R^\top = Y^\top Y$, which is positive definite (and hence invertible) iff $Y$ has full column rank. But since $\left(\Lambda^{1/2}\right)^\top Q^\top$ is invertible, the column rank of $Y$ equals the column rank of $R^\top$, which is the row rank of $R$.
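A minimal NumPy sketch of this construction (the variable names and dimensions below are illustrative only): factor $\left(X^\top X\right)^{-1} = Q\Lambda Q^\top$, form $Y$, and check that $Y^\top Y$ reproduces $R\left(X^\top X\right)^{-1}R^\top$ with rank equal to the row rank of $R$.

```python
import numpy as np

rng = np.random.default_rng(2)
N, K, J = 50, 4, 2
X = rng.standard_normal((N, K))
R = rng.standard_normal((J, K))          # full row rank J (a.s.)

# eigendecomposition of the symmetric p.d. matrix (X'X)^{-1}
eigvals, Q = np.linalg.eigh(np.linalg.inv(X.T @ X))
Y = np.diag(np.sqrt(eigvals)) @ Q.T @ R.T    # Y = (Lam^{1/2})' Q' R'

lhs = R @ np.linalg.inv(X.T @ X) @ R.T
print(np.allclose(lhs, Y.T @ Y))             # True: Y'Y = R (X'X)^{-1} R'
print(np.linalg.matrix_rank(Y) == J)         # True: full column rank
```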

statmerkur
  • 5,950
  • To be fair, I indeed thought along this line. Each approach is correct; it's just the interplay of positive definiteness and full rank. – User1865345 Nov 28 '22 at 09:25