In this answer @ttnphns writes:

> Both eigenvectors and loadings are similar in respect that they serve regressional coefficients in predicting the variables by the components (not vice versa!)

and in a footnote adds:

> Since eigenvector matrix in PCA is orthonormal and its inverse is its transpose, we may say that those same eigenvectors are also the coefficients to back predict the components by the variables. It is not so for loadings, though.
I understand the distinction between eigenvectors and loadings, as explained in many posts here. But why is it that we can say
$$\text{Component}_1 = \text{Eigenvector}_{11}\, x_1 + \text{Eigenvector}_{21}\, x_2 + \dots$$

but not

$$\text{Component}_1 = \text{Loading}_{11}\, x_1 + \text{Loading}_{21}\, x_2 + \dots$$
And why does this problem not arise in the other direction, when predicting the variables from the components?
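To make the question concrete, here is a small numerical sketch of what I think the two statements mean (my own illustration, not from the linked answer; I'm assuming centered data, loadings defined as eigenvectors scaled by $\sqrt{\lambda}$, and "standardized components" meaning scores rescaled to unit variance):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3)) @ rng.standard_normal((3, 3))
X = X - X.mean(axis=0)                       # center the variables

# Eigendecomposition of the sample covariance matrix
eigvals, V = np.linalg.eigh(np.cov(X, rowvar=False))
order = np.argsort(eigvals)[::-1]            # sort by decreasing variance
eigvals, V = eigvals[order], V[:, order]

scores = X @ V                               # components; their variances equal eigvals
loadings = V * np.sqrt(eigvals)              # loadings A = V * sqrt(lambda)

# Eigenvectors work in both directions, because V is orthonormal (V^{-1} = V^T):
print(np.allclose(scores, X @ V))            # components from variables
print(np.allclose(X, scores @ V.T))          # variables from components

# Loadings predict the variables from the *standardized* components...
std_scores = scores / np.sqrt(eigvals)       # rescale each component to unit variance
print(np.allclose(X, std_scores @ loadings.T))

# ...but substituting loadings for eigenvectors does NOT recover the components
# (each column of X @ loadings is off by a factor of sqrt(lambda_j)):
print(np.allclose(scores, X @ loadings))
```

If my setup is right, the first three checks pass and the last one fails, which is exactly the asymmetry I am asking about.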