Questions tagged [eigenvalues]

For questions involving calculation or interpretation of eigenvalues or eigenvectors.

Questions involving the calculation or interpretation of eigenvalues should use this tag. This may include factor analysis, principal components analysis or regression, or other model estimation methods that require a positive definite matrix (one whose eigenvalues are all positive). In factor analysis, a factor's eigenvalue is $\sum(\text{loadings on that factor})^2$, and eigenvalue $\div \sum \text{eigenvalues}$ is the proportion of total variance explained by the factor.

From Wikipedia:

An eigenvector of a square matrix $A$ is a non-zero vector $v$ that, when the matrix is multiplied by $v$, yields a constant multiple of $v$, the multiplier being commonly denoted by $\lambda$. That is:

$$A v = \lambda v$$

The number $\lambda$ is called the eigenvalue of $A$ corresponding to $v$.
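
As a minimal sketch (assuming NumPy; the matrix here is an arbitrary example), the defining relation $Av = \lambda v$ can be verified numerically for each eigenpair returned by `np.linalg.eig`:

```python
import numpy as np

# An arbitrary symmetric example; its eigenvalues are 3 and 1.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Check A v = lambda v for each eigenpair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

print(eigenvalues)  # [3. 1.] (order may vary)
```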

In the context of factor analysis, a factor's eigenvalue is the sum of all variables' squared loadings on that factor. A factor loading is the correlation of a variable with the factor, so a squared loading is the variance in that variable explained by the factor. A factor's eigenvalue divided by the sum of all eigenvalues is therefore the proportion of total variance explained by that factor. From Wikipedia:

If a factor has a low eigenvalue, then it contributes little to the explanation of variances in the variables and may be ignored as redundant with more important factors.
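
As an illustrative sketch (the data here are hypothetical), the eigenvalues of a correlation matrix and the proportion of total variance each one explains can be computed as follows:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))        # hypothetical data: 200 cases, 5 variables
R = np.corrcoef(X, rowvar=False)     # 5 x 5 correlation matrix

# eigvalsh is suited to symmetric matrices; it returns eigenvalues
# in ascending order, so reverse for a scree-style ordering.
eigenvalues = np.linalg.eigvalsh(R)[::-1]

# Eigenvalue / sum of eigenvalues = proportion of variance explained.
explained = eigenvalues / eigenvalues.sum()
print(np.round(explained, 3))
```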

All of the above is also true of principal components analysis. Principal components regression eliminates components with small eigenvalues for a similar purpose: reducing the dimensionality of a set of regressors.
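
A self-contained sketch of that idea (hypothetical data; a plain eigendecomposition plus least squares rather than any particular library routine): project the regressors onto the top-$k$ eigenvectors of their covariance matrix, then regress on the resulting component scores:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                 # hypothetical regressors
y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)

# Eigendecomposition of the covariance of the centered regressors.
Xc = X - X.mean(axis=0)
eigenvalues, eigenvectors = np.linalg.eigh(np.cov(Xc, rowvar=False))
order = np.argsort(eigenvalues)[::-1]         # largest eigenvalues first
eigenvectors = eigenvectors[:, order]

k = 3                                         # drop the 2 smallest components
Z = Xc @ eigenvectors[:, :k]                  # reduced set of regressors

# Ordinary least squares on the retained components.
beta, *_ = np.linalg.lstsq(Z, y - y.mean(), rcond=None)
print(beta)
```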

Many criteria exist for identifying an appropriate threshold, and they vary in their utility.
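
One widely cited example is Kaiser's rule: retain components whose eigenvalues exceed 1 when working from a correlation matrix (whose average eigenvalue is exactly 1). A self-contained sketch with hypothetical data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                 # hypothetical data
R = np.corrcoef(X, rowvar=False)
eigenvalues = np.linalg.eigvalsh(R)[::-1]

# Kaiser's rule: a component with eigenvalue > 1 explains more
# variance than a single standardized variable does.
n_retained = int((eigenvalues > 1.0).sum())
print(f"retain {n_retained} of {eigenvalues.size} components")
```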

435 questions
9 votes · 1 answer

Does the sign of eigenvectors matter?

I have a matrix: $$ \left[ \begin{array}{ccc} 2 & 1 & 1 \\ -11 & 4 & 5 \\ -1 & 1 & 0 \\ \end{array}\right] $$ I got the eigenvalues to be $[-1,1,2]$. For the eigenvalue $-1$, I got an eigenvector $[0,1,-1]$. In the solutions, it says the correct…
Alex • 91
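
On the general point this title raises (using a hypothetical matrix rather than the one in the question): if $Av = \lambda v$, then $A(-v) = \lambda(-v)$, so an eigenvector is determined only up to a nonzero scalar, and a sign flip is harmless. A quick NumPy check:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)
lam, v = eigenvalues[0], eigenvectors[:, 0]

# Both v and -v satisfy the eigenvector equation for the same lambda.
print(np.allclose(A @ v, lam * v))        # True
print(np.allclose(A @ (-v), lam * (-v)))  # True
```
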
4 votes · 2 answers

Rank of Between Class Scatter Matrix in Linear Discriminant Analysis

In the derivation of Fisher's linear discriminant (the 2-class problem in particular), I notice that the between-class scatter matrix $S_B$ is said to have rank at most 1. What is the significance of this fact to the LDA process? Does this fact have…
Minaj • 1,421
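
A numerical sketch of why the rank is at most 1 (hypothetical data): with two classes, $S_B$ is proportional to the outer product $(\mu_1-\mu_2)(\mu_1-\mu_2)^\top$ of the mean difference with itself, and the outer product of a nonzero vector with itself has rank 1:

```python
import numpy as np

rng = np.random.default_rng(1)
class1 = rng.normal(0.0, 1.0, size=(30, 4))  # hypothetical 4-D samples
class2 = rng.normal(1.0, 1.0, size=(40, 4))

diff = class1.mean(axis=0) - class2.mean(axis=0)
S_B = np.outer(diff, diff)                   # between-class scatter, 2 classes

print(np.linalg.matrix_rank(S_B))            # 1
```
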
2 votes · 0 answers

How do I check if the spectral radius of a matrix is greater than 1?

I have a real matrix whose two largest eigenvalues are a complex-conjugate pair. I need to check whether their absolute value is greater than 1. Since the eigenvalue of largest absolute value is complex, the power method, and any other method that…
seed • 131
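
One straightforward numerical approach (sketched here on a hypothetical matrix): compute all eigenvalues with `np.linalg.eigvals` and take the maximum modulus, which handles complex-conjugate pairs directly:

```python
import numpy as np

# Hypothetical real matrix whose dominant eigenvalues are the
# complex-conjugate pair 1.05 * exp(+/- 0.5i) (a scaled rotation).
c, s = 1.05 * np.cos(0.5), 1.05 * np.sin(0.5)
A = np.array([[c,  -s,  0.0],
              [s,   c,  0.0],
              [0.0, 0.0, 0.3]])

# np.abs returns the modulus of a complex eigenvalue.
spectral_radius = np.abs(np.linalg.eigvals(A)).max()
print(spectral_radius)        # approx 1.05
print(spectral_radius > 1.0)  # True
```
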
0 votes · 1 answer

Singular values and eigenvalues for a square matrix

Is it true that for a square symmetric matrix, such as a covariance matrix, the singular values are equal to the eigenvalues? Is the eigendecomposition of a covariance matrix the same as its singular value decomposition?
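
A quick check of what the excerpt asks (hypothetical data): a covariance matrix is symmetric positive semi-definite, and for such matrices the singular values equal the eigenvalues (for a general symmetric matrix they equal the eigenvalues' absolute values):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 3))
C = np.cov(X, rowvar=False)        # symmetric positive semi-definite

eigenvalues = np.sort(np.linalg.eigvalsh(C))[::-1]    # descending
singular_values = np.linalg.svd(C, compute_uv=False)  # descending by convention

print(np.allclose(eigenvalues, singular_values))      # True
```
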
0 votes · 0 answers

Combining two eigenfaces of the same person

What exactly would the benefit be of having more than one eigenface to work with when trying to do facial recognition? Say I have an average face already prepared from my set of faces, and I know that many of the faces are different pictures of the…
0 votes · 1 answer

Is there a typo in this paper on Slow feature analysis?

In this picture you can see the formula (red rectangle added by me for emphasis): $$ \textbf{V}^\intercal\textbf{HV} = \textbf{D} $$ Shouldn't this rather be (an eigenvalue decomposition): $$ \textbf{V}^\intercal\textbf{DV} = \textbf{H} $$ The first…
0 votes · 2 answers

What's the meaning of $O(1)$ and $O(n)$?

I have the equation: \begin{equation} X_{t}=\Lambda F_t+u_{t}, \end{equation} with: \begin{equation} Cov(X_t)=\Lambda Cov(F_t) \Lambda' + Cov(u_t) \end{equation} where $F_t$ is an $r \times 1$ vector, $\Lambda$ is an $n\times r$ matrix and $u_t$ is …