Let $b$ denote the OLS estimator. Assuming normality of the errors, $b \sim \mathop{\mathcal N}\left(\beta, \sigma^2\left(X'X\right)^{-1}\right)$, where $X\in\mathbb R^{N\times K}$ has full column rank $K$. Then, for $R\in\mathbb R^{J\times K}$ with $J\leq K$, we must have$$Rb\sim \mathop{\mathcal N}\left(R\beta,\sigma^2R\left(X'X\right)^{-1}R'\right),$$which in turn implies $\left(Rb-R\beta\right)'\left[\sigma^2 R\left(X'X\right)^{-1}R'\right]^{-1}\left(Rb-R\beta\right)\sim\chi^2_J$.
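For concreteness, here is a small Monte Carlo sketch of the $\chi^2_J$ claim (the dimensions $N=50$, $K=4$, $J=2$, the value $\sigma=1.5$, and the particular $\beta$ are arbitrary choices, not part of the derivation):

```python
import numpy as np

rng = np.random.default_rng(1)
N, K, J, sigma = 50, 4, 2, 1.5
X = rng.standard_normal((N, K))       # full column rank almost surely
beta = np.array([1.0, -2.0, 0.5, 3.0])
R = rng.standard_normal((J, K))       # full row rank J almost surely

XtX_inv = np.linalg.inv(X.T @ X)
M_inv = np.linalg.inv(sigma**2 * R @ XtX_inv @ R.T)

reps = 20000
w = np.empty(reps)
for i in range(reps):
    y = X @ beta + sigma * rng.standard_normal(N)
    b = XtX_inv @ X.T @ y             # OLS estimate from this draw
    d = R @ (b - beta)
    w[i] = d @ M_inv @ d              # the quadratic form above

# If the claim holds, w is chi^2_J, so its sample mean should be near J = 2.
print(w.mean())
```

The sample mean of the quadratic form comes out close to $J$, as a $\chi^2_J$ variate's mean should.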
My question is: how do we know that $R\left(X'X\right)^{-1}R'$ is invertible? $R$ is assumed to have full row rank $J$, and $X'X$ is symmetric and invertible, but I don't see how this implies that the $J\times J$ product has full rank and is therefore invertible. I have seen this derivation in several sources, including on SE, and no one addresses this point.
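For what it's worth, a quick numerical sanity check (with arbitrary dimensions $N=50$, $K=4$, $J=2$) suggests the product is not just invertible but positive definite:

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, J = 50, 4, 2

# Random matrices with continuous entries have full column / row rank
# almost surely.
X = rng.standard_normal((N, K))
R = rng.standard_normal((J, K))

XtX_inv = np.linalg.inv(X.T @ X)   # (X'X)^{-1}, symmetric positive definite
M = R @ XtX_inv @ R.T              # the J x J product in question

# All eigenvalues positive => M is positive definite, hence invertible.
eigvals = np.linalg.eigvalsh(M)
print(eigvals)
print(np.all(eigvals > 0))
```

Of course, a simulation is not a proof; I am asking for the argument that makes this hold in general.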