3

Is there a method for finding the number of eigenvectors with a given eigenvalue? I do not need the eigenvectors themselves, and computing them seems quite tough, judging by the comments on the answer to the question Compute eigenvectors of a matrix with known eigenvalue spectrum.


I have many randomly generated adjacency matrices $M$ of dimension roughly $10^5 \times 10^5$ with elements $0$ or $1$. I can show analytically that these matrices are very likely to have many (roughly $10^3$) eigenvalues that are exactly $1$. The other eigenvalues lie in a roughly symmetric spectrum about $0$, with a typical spacing of around $10^{-4}$ between adjacent eigenvalues, including near $1$. I am interested in the number of degenerate eigenvalues at $1$.

I'm tempted to try shift-invert with SciPy's sparse package, which uses ARPACK and LAPACK, though I've often found that shift-invert struggles with interior eigenvalues when the level spacing is small.
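For concreteness, here is a minimal sketch of what I have in mind, using `scipy.sparse.linalg.eigsh` in shift-invert mode. The small block-diagonal matrix below is only a stand-in with a known degeneracy (not my actual matrices), and the shift is placed slightly off $1$ so that $M - \sigma I$ is not exactly singular:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

# Stand-in with a known degeneracy: the adjacency matrix of a perfect matching
# on 2m vertices has eigenvalue +1 with multiplicity m.  A path graph is
# appended so the spectrum is not only +/-1.  Replace M with the real matrix.
m = 30
edge = sp.csc_matrix(np.array([[0.0, 1.0], [1.0, 0.0]]))
path = sp.diags([np.ones(99), np.ones(99)], offsets=[1, -1], format="csc")
M = sp.block_diag([edge] * m + [path], format="csc")

# Shift-invert around sigma ~ 1: ARPACK returns the k eigenvalues closest to
# sigma.  The shift is offset slightly so that M - sigma*I can be factorized,
# and k must exceed the expected degeneracy for the count to be meaningful.
k = 50
vals = eigsh(M, k=k, sigma=1.0 + 1e-6, which="LM", return_eigenvectors=False)

# Count the eigenvalues that are numerically equal to 1.
degeneracy = np.count_nonzero(np.abs(vals - 1.0) < 1e-8)
print(degeneracy)  # true multiplicity for this stand-in: 30
```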

However, I am wondering if there's a way to avoid interior eigensolving altogether and to estimate instead the number of degenerate eigenvalues through more clever means.

As an aside, for my particular problem, I am happy with approximate algorithms so long as the error in the degeneracy is bounded by $5\%$ of the true degeneracy, or, maybe more practically, if a $95\%$ confidence interval has a width of $10\%$.

user196574
  • If this is a symmetric matrix with entries 0/1, then it seems to me that there can only be a 1 in each row, otherwise you'd have eigenvalues larger than 1. Is that correct? Or is there some scaling missing in your description? – Federico Poloni Nov 02 '21 at 07:53
  • 3
    Anyhow, if you are doing this to compute the number of connected component of a graph, probably a "normal" connectivity check via a graph visit would be cheaper. – Federico Poloni Nov 02 '21 at 07:55
  • @FedericoPoloni There can be eigenvalues larger than $1$; I just expect there to be a very large number of eigenvalues that are exactly $1$. – user196574 Nov 02 '21 at 23:06

1 Answer

5

If you know $\lambda$, then the eigenvalue problem $Ax=\lambda x$ comes down to finding a vector $x$ so that $(A-\lambda I)x=0$. In other words, you want to characterize the null space of the matrix $B=A-\lambda I$. If all you care about is the multiplicity of $\lambda$, then what you are asking for is the dimension of the null space of $B$.

You can do this via a rank-revealing QR decomposition, for example. That may still be more work than you are looking to do, though.
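As a minimal sketch of this idea (dense and $O(n^3)$, so only practical for moderate sizes; the helper name `eigenvalue_multiplicity_qr` is just illustrative):

```python
import numpy as np
from scipy.linalg import qr

def eigenvalue_multiplicity_qr(A, lam, tol=None):
    """Multiplicity of lam as an eigenvalue of A, via a column-pivoted
    (rank-revealing) QR factorization of B = A - lam*I."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    B = A - lam * np.eye(n)
    # With column pivoting, the diagonal of R has (roughly) non-increasing
    # magnitude; entries below the tolerance indicate rank deficiency.
    R = qr(B, mode="r", pivoting=True)[0]
    diag = np.abs(np.diag(R))
    if tol is None:
        tol = n * np.finfo(float).eps * max(diag.max(), 1.0)
    rank = int(np.count_nonzero(diag > tol))
    return n - rank

# Example: the adjacency matrix of two disjoint edges has eigenvalue 1
# with multiplicity 2 (and eigenvalue -1 with multiplicity 2).
A = np.array([[0, 1, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(eigenvalue_multiplicity_qr(A, 1.0))  # 2
```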

Wolfgang Bangerth
  • Thanks, this is useful, since I know there exist certain rank-estimating algorithms, like scipy's estimate_rank. The output rank being about $8\times$ larger is a bit too much for that particular algorithm, but I think I'd be good with $5\%$ error, which I'll add to the question statement. – user196574 Nov 02 '21 at 06:30
  • @user196574 If I read the docs correctly, "8 larger" means $r+8$, not $8r$. – Federico Poloni Nov 02 '21 at 23:24
  • @FedericoPoloni Thanks, I misread that! I'll give it a try for some smaller toy matrices to see if it works speedily. – user196574 Nov 02 '21 at 23:34
  • 2
    @FedericoPoloni Confirming that scipy's estimate_rank worked well for my sparse matrices. – user196574 Nov 12 '21 at 10:21
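Following up on the comments above, here is a minimal usage sketch of the rank-estimation route. `scipy.linalg.interpolative.estimate_rank` accepts a `LinearOperator`; the small block-diagonal matrix below is only a stand-in with a known multiplicity, and in practice the randomized estimate may overshoot the true rank by a few:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import aslinearoperator
from scipy.linalg.interpolative import estimate_rank

# Stand-in sparse 0/1 adjacency matrix: a perfect matching on 2m vertices,
# so the eigenvalue 1 has multiplicity m.  Replace A with the real matrix
# (cast to float, since scipy.linalg.interpolative expects float64/complex128).
m = 200
edge = sp.csr_matrix(np.array([[0.0, 1.0], [1.0, 0.0]]))
A = sp.block_diag([edge] * m, format="csr")
n = A.shape[0]

# The multiplicity of the eigenvalue 1 equals the rank deficiency of B = A - I.
B = aslinearoperator(A) - aslinearoperator(sp.identity(n, format="csr", dtype=float))

# Randomized rank estimate to the requested relative precision.
rank = estimate_rank(B, eps=1e-10)
print("estimated multiplicity of eigenvalue 1:", n - rank)  # true value: 200
```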