
In this answer, @ttnphns writes:

Both eigenvectors and loadings are similar in respect that they serve regressional coefficients in predicting the variables by the components (not vice versa!)

and in a footnote adds:

Since eigenvector matrix in PCA is orthonormal and its inverse is its transpose, we may say that those same eigenvectors are also the coefficients to back predict the components by the variables. It is not so for loadings, though.

I understand the distinction between eigenvectors and loadings, as explained in many posts here. But why is it that we can say

$\text{Component}_1 = \text{Eigenvector}_{11} \times x_1 + \text{Eigenvector}_{21} \times x_2 + \dots$

but not

$\text{Component}_1 = \text{Loading}_{11} \times x_1 + \text{Loading}_{21} \times x_2 + \dots$

And why does this problem not arise in the other direction, when predicting the variables by the components?
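To make this concrete, here is a minimal numpy sketch of what I mean. The data are random and made up by me, and I take a loading to be an eigenvector entry scaled by the square root of the corresponding eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 3)) @ rng.standard_normal((3, 3))
X -= X.mean(axis=0)                      # center the variables

# Eigendecomposition of the sample covariance matrix
eigvals, V = np.linalg.eigh(np.cov(X, rowvar=False))
order = np.argsort(eigvals)[::-1]        # sort by decreasing eigenvalue
eigvals, V = eigvals[order], V[:, order]
A = V * np.sqrt(eigvals)                 # loadings: A = V * sqrt(eigenvalues)

scores = X @ V                           # components from variables via eigenvectors
print(np.allclose(scores @ V.T, X))      # True: V is orthonormal, so V^{-1} = V^T
print(np.allclose((scores / np.sqrt(eigvals)) @ A.T, X))
                                         # True: loadings predict the variables
                                         # from the *standardized* components
print(np.allclose(A.T @ A, np.eye(3)))   # False: A^T A = diag(eigenvalues), so
                                         # A^T is not the inverse of A
```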

  • Either you misunderstood the citations or understood them away from the context. Clearly, since a loading is just an eigenvector value scaled by a constant factor, you may use loadings in place of eigenvectors as coefficients to compute principal component scores. In your two equations, Component_1 will be collinear with the other Component_1, but the second one will have variance $L_1^2$ while the first one will have variance $L_1$, the 1st eigenvalue. It is unusual to request a PC variance other than equal to its eigenvalue (raw PC) or to 1 (standardized PC). – ttnphns Apr 02 '23 at 17:01
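For what it's worth, continuing the sketch above, a quick numerical check agrees with this comment: the two versions of $\text{Component}_1$ are collinear, with variances $L_1$ and $L_1^2$ respectively:

```python
pc1_eig  = X @ V[:, 0]                            # eigenvector coefficients
pc1_load = X @ A[:, 0]                            # loading coefficients
print(np.var(pc1_eig,  ddof=1), eigvals[0])       # variance = L_1
print(np.var(pc1_load, ddof=1), eigvals[0]**2)    # variance = L_1^2
print(np.corrcoef(pc1_eig, pc1_load)[0, 1])       # ~1.0: collinear
```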

0 Answers