11

This is a specialized version of a previous question: Complexity of Finding the Eigendecomposition of a Matrix.

For $N \times N$ symmetric matrices, it is known that $O(N^3)$ time suffices to compute the eigendecomposition. The question is: can we achieve sub-cubic complexity? Thanks.
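(As an aside, the cubic bound is what standard dense symmetric eigensolvers already achieve; here is a minimal NumPy sketch, using the LAPACK-backed routine `numpy.linalg.eigh`, just to make the $O(N^3)$ baseline concrete.)

```python
import numpy as np

# A random N x N symmetric test matrix.
N = 500
A = np.random.randn(N, N)
A = (A + A.T) / 2

# numpy.linalg.eigh calls the standard LAPACK symmetric eigensolver
# (Householder tridiagonalization plus iterative diagonalization),
# which takes O(N^3) arithmetic operations.
w, Q = np.linalg.eigh(A)

# Check the decomposition A = Q diag(w) Q^T up to floating-point error.
print(np.allclose(A, Q @ np.diag(w) @ Q.T))  # expected: True
```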

Lihong Li
  • Does this really need a separate question? Surely if someone knew the answer to this special case they would have said so in the other question. – Warren Schudy Nov 18 '10 at 17:11
  • I stressed worst-case in my question, so I think this is fair... – Lev Reyzin Nov 18 '10 at 17:55
  • Are you sure about that O(N^3) time bound? See my related question about Gaussian elimination. – Jeffε Dec 29 '10 at 21:45
  • It seems from http://mathoverflow.net/questions/24287/what-is-the-best-algorithm-to-find-the-smallest-nonzero-eigenvalue-of-a-symmetric/24294#24294 you can get $O(n^3)$ for an approximate solution. – Lev Reyzin Dec 30 '10 at 20:54

2 Answers

5

I know this is a really old question, but this recent paper https://arxiv.org/abs/1912.08805 improves the runtime for (approximate) diagonalization to nearly matrix-multiplication time, i.e. $O(n^\omega)$ up to polylogarithmic factors, down from $O(n^3)$.

walkerbacker
2

As I see it, this special case is not easier than the general case. Purely symbolically, you can reduce the problem of finding the singular-value decomposition (SVD) of an arbitrary matrix $M$ to the problem of diagonalizing a symmetric matrix: the eigenvalues of $M^* M$ are the squares of the singular values of $M$, and its eigenvectors are right singular vectors, so one can read off the SVD of $M$ from the eigendecomposition of $M^* M$. Note that the reduction involves only a matrix multiplication to compute $M^* M$. It does not seem that there should be any serious numerical issues.
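For illustration, here is a small numerical sketch of that reduction, with `numpy.linalg.eigh` standing in for a black-box symmetric diagonalization routine; it assumes $M$ is real with full column rank so that the division by the singular values is safe.

```python
import numpy as np

# A generic (non-symmetric, rectangular) test matrix; a complex matrix
# would use M.conj().T in place of M.T below.
M = np.random.randn(6, 4)

# Diagonalize the symmetric matrix M* M with a black-box symmetric
# eigensolver (numpy.linalg.eigh stands in for it here).
eigvals, V = np.linalg.eigh(M.T @ M)

# Sort eigenpairs in decreasing order to match the usual SVD convention.
order = np.argsort(eigvals)[::-1]
eigvals, V = eigvals[order], V[:, order]

# Singular values are square roots of the eigenvalues of M* M, the
# columns of V are right singular vectors, and U = M V / sigma gives the
# left singular vectors (this step assumes full column rank).
sigma = np.sqrt(np.clip(eigvals, 0.0, None))
U = (M @ V) / sigma

print(np.allclose(M, U @ np.diag(sigma) @ V.T))  # expected: True
```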