Here is how you can get the RSS without finding the regression coefficients. Let's say the problem is:
$$
\begin{align}
\mathbf{Y}&=\mathbf{X}\cdot\boldsymbol{\beta}+\boldsymbol{\rho}\\
\left(\begin{array}{c}Y_1\\Y_2\\\vdots\\Y_n\end{array}\right)&=
\left(\begin{array}{cccc}
X_{11} & X_{12} & \dots & X_{1p} \\
X_{21} & X_{22} & \dots & X_{2p} \\
\vdots & & & \vdots \\
X_{n1} & X_{n2} & \dots & X_{np} \\
\end{array}\right)\cdot\left(\begin{array}{c}\beta_1\\\vdots\\\beta_p\end{array}\right)+\left(\begin{array}{c}\rho_1\\\rho_2\\\vdots\\\rho_n\end{array}\right)
\end{align}
$$
where $\boldsymbol{\rho}$ is the vector of residuals. You can decompose $\mathbf{X}$ by singular value decomposition into an $n\times n$ orthogonal matrix $\mathbf{U}$, a $p\times p$ orthogonal matrix $\mathbf{V}$, and an $n\times p$ matrix $\mathbf{D}$ that is zero away from its main diagonal. It is a good idea to impose a regularization condition: treat every diagonal element below some small tolerance as exactly zero, so that only $s$ nonzero singular values $d_1,\dots,d_s$ remain:
$$
\mathbf{X}=\mathbf{U}\cdot \left(\begin{array}{ccccccc}
d_1 & 0 & 0 & 0 & 0 &\dots & 0 \\
0 & d_2 & 0 & 0 & 0 &\dots & 0 \\
0 & 0 & \ddots & 0 & 0 &\dots & 0 \\
0 & 0 & 0 & d_s & 0 &\dots & 0 \\
0 & 0 & 0 & 0 & 0 &\dots & 0 \\
\vdots & & & & & & \vdots \\
0 & 0 & 0 & 0 & 0 &\dots & 0 \\
\end{array}\right) \cdot \mathbf{V}^T
$$
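If it helps, here is a minimal numpy sketch of this step; the random `X` and `Y` and the cutoff heuristic for `tol` are assumptions made purely for illustration:

```python
import numpy as np

# Hypothetical example data; replace with your own design matrix and response.
rng = np.random.default_rng(0)
n, p = 50, 4
X = rng.standard_normal((n, p))
Y = rng.standard_normal(n)

# Thin SVD: X = U @ diag(d) @ Vt, with d sorted in decreasing order.
U, d, Vt = np.linalg.svd(X, full_matrices=False)

# Regularization: treat singular values below a tolerance as exactly zero.
tol = max(n, p) * np.finfo(d.dtype).eps * d[0]  # assumed cutoff heuristic
s = int(np.sum(d > tol))                        # number of retained d_i
```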
One can then apply an orthogonal transformation:
$$
\begin{align}
\mathbf{Z}&=\mathbf{U}^T \cdot \mathbf{Y} \\
\mathbf{r}&=\mathbf{U}^T \cdot \boldsymbol{\rho} \\
\boldsymbol{\alpha}&=\mathbf{V}^T\cdot\boldsymbol{\beta}
\end{align}
$$
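In code, only $\mathbf{Z}$ is needed; continuing the sketch above, with the thin SVD the first $s$ columns of `U` are the ones that matter:

```python
# Rotated responses: Z_i = u_i^T Y along the s retained singular directions.
Z = U[:, :s].T @ Y
```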
Then we have:
$$
\begin{align}
\left(\begin{array}{c}Z_1\\Z_2\\\vdots\\Z_s\end{array}\right)&=
\left(\begin{array}{cccc}
d_1 & 0 & \dots & 0\\
0 & d_2 & \dots & 0\\
0 & 0 & \ddots & 0 \\
0 & 0 & \dots & d_s\\
\end{array}\right)\cdot\left(\begin{array}{c}\alpha_1\\\vdots\\\alpha_s\end{array}\right)+\left(\begin{array}{c}r_1\\\vdots\\r_s\end{array}\right)\\
\left(\begin{array}{c}Z_{s+1}\\\vdots\\Z_{n}\end{array}\right)&=
\left(\begin{array}{c}r_{s+1}\\\vdots\\r_{n}\end{array}\right)
\end{align}
$$
You are free to choose any values for $\alpha_1,\dots,\alpha_s$ and, by construction, $d_1,\dots,d_s\neq 0$, so the residuals $r_1,\dots,r_s$ can all be set to zero. Due to the orthogonality of $\mathbf{U}$, we have $\boldsymbol{\rho}^T\boldsymbol{\rho}=\mathbf{r}^T\mathbf{r}$ and $\mathbf{Y}^T\mathbf{Y}=\mathbf{Z}^T\mathbf{Z}$, so:
$$
\boldsymbol{\rho}^T\cdot\boldsymbol{\rho}=\rho_1^2+\dots+\rho_n^2=\mathbf{r}^T\cdot\mathbf{r}=Z_{s+1}^2+\dots+Z_{n}^2=\sum_{i=1}^{n} Y_i^2-\left(Z_1^2+\dots + Z_s^2\right)
$$
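Continuing the sketch, the RSS then follows from $\|\mathbf{Y}\|^2$ and the retained components alone:

```python
# With r_1..r_s driven to zero by the choice of alpha, what is left of
# ||Y||^2 after removing Z_1^2..Z_s^2 is exactly the RSS.
rss_svd = Y @ Y - Z @ Z
```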
Let $\mathbf{v}_j$ be the $j$-th column vector of the matrix $\mathbf{V}$. It then follows from $\mathbf{X}^T=\mathbf{V}\cdot\mathbf{D}^T\cdot\mathbf{U}^T$ and $\mathbf{Z}=\mathbf{U}^T\cdot\mathbf{Y}$ that:
$$
\mathbf{X}^T\cdot\mathbf{Y}=\left(\mathbf{v}_1\,\mathbf{v}_2\,\dots\,\mathbf{v}_s\right)\cdot
\left(\begin{array}{cccc}
d_1 & 0 & \dots & 0\\
0 & d_2 & \dots & 0\\
0 & 0 & \ddots & 0 \\
0 & 0 & \dots & d_s\\
\end{array}\right)\cdot \left(\begin{array}{c}Z_1\\Z_2\\\vdots\\Z_s\end{array}\right)
$$
Since the columns of $\mathbf{V}$ are orthonormal, this gives:
$$
Z_i=\frac{\mathbf{v}_i^T\cdot \mathbf{X}^T\cdot \mathbf{Y}}{d_i},\qquad i=1,\dots,s
$$
Thus:
$$
\rho_1^2+\dots+\rho_n^2=\sum_{i=1}^{n} Y_i^2-\sum_{j=1}^{s}\frac{\left(\mathbf{v}_j^T\cdot \mathbf{X}^T\cdot \mathbf{Y}\right)^2}{d_j^2}
$$
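In the running sketch, this final formula needs only `X.T @ Y`, the rows of `Vt` (which are the $\mathbf{v}_j^T$), and the $d_j$:

```python
# RSS written purely in terms of X^T Y, v_j and d_j, as in the formula above.
XtY = X.T @ Y
rss_formula = Y @ Y - sum((Vt[j] @ XtY) ** 2 / d[j] ** 2 for j in range(s))
```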
This is what you are after, I believe. You still need a way of extracting the $\mathbf{v}_j$ and $d_j$.
Finally, the non-normalized covariance matrix (assuming the columns of $\mathbf{X}$ are centered) is:
$$
\left(n-1\right)\mathbf{C}=\mathbf{X}^T\cdot\mathbf{X}=\mathbf{V}\cdot \left(\begin{array}{ccccccc}
d_1^2 & 0 & 0 & 0 & 0 &\dots & 0 \\
0 & d_2^2 & 0 & 0 & 0 &\dots & 0 \\
0 & 0 & \ddots & 0 & 0 &\dots & 0 \\
0 & 0 & 0 & d_s^2 & 0 &\dots & 0 \\
0 & 0 & 0 & 0 & 0 &\dots & 0 \\
\vdots & & & & & & \vdots \\
0 & 0 & 0 & 0 & 0 &\dots & 0 \\
\end{array}\right)\cdot\mathbf{V}^T
$$
So all one needs to do is diagonalize the covariance matrix and work with its non-zero eigenvalues $d_j^2$ and the corresponding eigenvectors $\mathbf{v}_j$.
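Here is a sketch of that eigendecomposition route, reusing the assumed tolerance from above, with a sanity check against an explicit least-squares fit:

```python
# Diagonalize X^T X: its nonzero eigenvalues are d_j^2 and its
# eigenvectors are the v_j, so RSS needs only X^T X, X^T Y and ||Y||^2.
XtX = X.T @ X
eigvals, eigvecs = np.linalg.eigh(XtX)   # eigenvalues in ascending order
keep = eigvals > tol**2                  # drop (near-)zero eigenvalues
proj = eigvecs[:, keep].T @ XtY          # v_j^T X^T Y for retained j
rss_eig = Y @ Y - np.sum(proj**2 / eigvals[keep])

# Sanity check: np.linalg.lstsq returns the RSS when X has full column rank.
_, res, _, _ = np.linalg.lstsq(X, Y, rcond=None)
print(rss_svd, rss_formula, rss_eig, res[0])  # all four should agree
```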