Let's say that we have a matrix of variables (the columns are variables and the rows are observations) called X, where X = [x1, x2, ..., xp] and x1, x2, ..., xp are the variables. If I rotate X, then X' (the rotated version of X) will have a different covariance matrix, because x1', x2', ..., xp' are different from the original variables. If the covariance matrix is different, the principal components will also be different. So why do we say that PCA is invariant under rotation?

I am not an expert in PCA, but I do have a good working knowledge of it. Still, I cannot comprehend what it means when we say that PCA is invariant under rotation, or why this is the case. I have already seen this post (Is PCA invariant to orthogonal transformations?), but it is still not clear to me.
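To make my confusion concrete, here is a minimal NumPy sketch (the three-variable setup, the variable names, and the random orthogonal matrix are just my own illustration) showing that the covariance matrix does change under rotation:

```python
import numpy as np

rng = np.random.default_rng(0)

# n observations (rows) of p variables (columns), centered
n, p = 500, 3
X = rng.normal(size=(n, p)) @ rng.normal(size=(p, p))  # correlated columns
X -= X.mean(axis=0)

# A random p x p orthogonal matrix R (a rotation, possibly combined with
# a reflection, which does not affect the point), via a QR decomposition
R, _ = np.linalg.qr(rng.normal(size=(p, p)))
X_rot = X @ R  # X', the rotated data

# The two covariance matrices are indeed different in general
C = np.cov(X, rowvar=False)
C_rot = np.cov(X_rot, rowvar=False)
print(np.allclose(C, C_rot))  # False for almost every R
```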

  • Related: https://stats.stackexchange.com/q/239069/362671 – User1865345 Aug 20 '22 at 10:37
  • I have already seen this post. It is a mathematical explanation of the rotation invariance of PCA. My question is what it means intuitively and in practice. – Amir Jalilifard Aug 20 '22 at 11:21
  • That means that if you rotate your data cloud arbitrarily in the space of the variables (i.e., rotate the system of axes, the variables, relative to the data cloud) and then apply PCA, the results (the PC values, I mean) will be the same as for the PCA of the initial data. This is because PCA is itself a rotation, but a special rotation, one that hierarchically maximizes portions of the multivariate variance, and that variance itself does not change under rotation. So, from any arbitrarily rotated position of the data, the same PCs are still found. – ttnphns Aug 20 '22 at 12:24
  • @ttnphns I have been thinking about this, but I see a problem here. Let's say that we have a matrix of variables (the columns are variables and the rows are observations) called X, where X = [x1, x2, ..., xp] and x1, x2, ..., xp are the variables. If I rotate X, then X' (the rotated version of X) will have a different covariance matrix, because x1', x2', ..., xp' are different from the original variables after the rotation. If the covariance matrix is different, the principal components will also be different. – Amir Jalilifard Aug 20 '22 at 14:26
  • So the eigenvectors (the bases) obtained by the decomposition of the covariance matrix will be different. Yes. But the eigenvalues (the PCs' variances) will still be the same. Also, the PC values (scores), obtained by multiplying the data matrices by the corresponding eigenvector matrices, will be the same as well (except possibly for a reversed sign in some components). – ttnphns Aug 20 '22 at 15:04
  • What are the eigenvector matrices (obtained from the covariance matrices)? They show the degree of rotatedness of the input data relative to the principal components of those data. The principal components of X and of X*R (where R is some rotation matrix) are the same, fixed, but the eigenvector matrices are different, and their discrepancy actually contains in itself the information about R (see the numerical sketch after this comment thread). – ttnphns Aug 20 '22 at 15:24
  • PCA is not entirely invariant: it is equivariant. When you rotate the original basis for expressing vectors and perform PCA, the principal components have been rotated by the same amount. This is because PCA can be expressed (and often is) in a coordinate-free way as finding a rotation that meets certain criteria. – whuber Aug 20 '22 at 18:26
  • @ttnphns Thank you for the explanation – Amir Jalilifard Aug 20 '22 at 21:27
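A minimal numerical check of the claims in the comments above (the setup mirrors the sketch in the question; the 3-variable example and all names are illustrative assumptions, not anyone's established code). It verifies that the eigenvalues and the PC scores survive the rotation, and that the two eigenvector matrices differ by exactly the rotation itself:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 500, 3
X = rng.normal(size=(n, p)) @ rng.normal(size=(p, p))
X -= X.mean(axis=0)
R, _ = np.linalg.qr(rng.normal(size=(p, p)))  # random orthogonal R
X_rot = X @ R

# Eigendecomposition of both covariance matrices
# (np.linalg.eigh sorts eigenvalues in ascending order in both cases)
evals, V = np.linalg.eigh(np.cov(X, rowvar=False))
evals_rot, V_rot = np.linalg.eigh(np.cov(X_rot, rowvar=False))

# 1) The eigenvalues (the PCs' variances) are unchanged
print(np.allclose(evals, evals_rot))                       # True

# 2) The PC scores agree up to a sign flip per component
print(np.allclose(np.abs(X @ V), np.abs(X_rot @ V_rot)))   # True

# 3) The eigenvector matrices differ by exactly the rotation:
#    V_rot = R.T @ V up to column signs, so their discrepancy encodes R
print(np.allclose(np.abs(R.T @ V), np.abs(V_rot)))         # True
```

The algebra behind check 3: since Cov(XR) = Rᵀ Cov(X) R, writing Cov(X) = V Λ Vᵀ gives Cov(XR) = (Rᵀ V) Λ (Rᵀ V)ᵀ, so the rotated data's eigenvectors are Rᵀ V (up to sign, when the eigenvalues are distinct), and the scores X_rot V_rot = X R Rᵀ V = X V are unchanged.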

0 Answers