I am reading a chapter about principal component analysis (PCA). It states that for any random variable $X \in \mathbb{R}^p$ with $n$ observations, $E[X] = \mu$, and $\operatorname{Cov}[X] = \Sigma$, the $i$-th PC is $y_i = \gamma_i^{T} X$, where $\gamma_i$ is the eigenvector corresponding to the $i$-th largest eigenvalue of the covariance matrix of $X$. It then states that the PC transformation is defined as $Y = \Gamma^T (X - \mu)$.
This confuses me because in earlier studies I learned that PCA is applied to centered data, so I always thought the $i$-th PC was $y_i = \gamma_i^{T} Z$, where $Z$ is the centered data.
So I think the book's statement of the PC transformation is right, but its first statement defining the PC is wrong, and even contradicts the PC transformation.

Could somebody please clarify whether I am right or wrong?
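To convince myself, I ran a small numerical check (my own sketch, not from the book). It shows that both formulas use the same eigenvectors of $\Sigma$ (since the covariance matrix centers internally either way), so the two sets of scores differ only by the constant shift $\Gamma^T \mu$ and have identical variances:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 500, 3
X = rng.normal(size=(n, p)) @ rng.normal(size=(p, p)) + np.array([5.0, -2.0, 1.0])

mu = X.mean(axis=0)
Sigma = np.cov(X, rowvar=False)          # covariance already centers internally

eigvals, Gamma = np.linalg.eigh(Sigma)   # columns of Gamma are eigenvectors
order = np.argsort(eigvals)[::-1]        # sort by decreasing eigenvalue
eigvals, Gamma = eigvals[order], Gamma[:, order]

Y_uncentered = X @ Gamma                 # y_i = gamma_i' X
Y_centered   = (X - mu) @ Gamma          # Y = Gamma' (X - mu)

# The two score sets differ only by the constant shift Gamma' mu:
print(np.allclose(Y_uncentered - Y_centered, mu @ Gamma))   # True
# Their covariances (and hence the variance explained) are identical:
print(np.allclose(np.cov(Y_uncentered, rowvar=False),
                  np.cov(Y_centered, rowvar=False)))        # True
```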
Eigenvalues and eigenvectors of the covariance matrix are those of the data $(X-\mu)/\sqrt{n}$ [or $n-1$ here, since we usually use this correction to compute the covariance], while eigenvalues and eigenvectors of the $X'X/n$ matrix ("mean SSCP" = MSCP matrix) correspond to those of the data $X/\sqrt{n}$. – ttnphns Jul 19 '15 at 12:34

The $\sqrt{n}$ in my comment affects (scales down) the eigenvalues, not the eigenvectors. – ttnphns Jul 19 '15 at 12:49
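A quick NumPy check of ttnphns's point (my own sketch, not from the comment itself): the covariance matrix is exactly $Z'Z$ for the scaled centered data $Z = (X-\mu)/\sqrt{n-1}$, the MSCP matrix is $W'W$ for $W = X/\sqrt{n}$, and dividing the data by $\sqrt{n}$ scales the eigenvalues by $1/n$ while leaving the eigenvectors unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 500, 3
X = rng.normal(size=(n, p)) + np.array([5.0, -2.0, 1.0])
mu = X.mean(axis=0)

# Covariance matrix = Z'Z for Z = (X - mu)/sqrt(n - 1):
Z = (X - mu) / np.sqrt(n - 1)
print(np.allclose(np.cov(X, rowvar=False), Z.T @ Z))       # True

# MSCP matrix X'X/n = W'W for W = X/sqrt(n):
W = X / np.sqrt(n)
print(np.allclose(X.T @ X / n, W.T @ W))                   # True

# Scaling the data by 1/sqrt(n) scales the eigenvalues (by 1/n)
# but leaves the eigenvectors unchanged:
vals, vecs = np.linalg.eigh(X.T @ X)
vals_n, vecs_n = np.linalg.eigh(X.T @ X / n)
print(np.allclose(vals / n, vals_n))                       # True
print(np.allclose(np.abs(vecs), np.abs(vecs_n)))           # True (signs may flip)
```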