5

This question is a follow-up to this one.

Let $A\in \mathbb{R}^{n\times n}$ be large, sparse, symmetric and positive definite. Suppose I already know $m<n$ eigenpairs of $A$, corresponding to the largest eigenvalues, i.e. $\{( \lambda_i, \vec{q}_i) \}_{i=1}^m \subset \mathbb{R}^+\times \mathbb{R}^n$.

I need to solve $A \vec{x}=\vec{b}$ for $\vec{x}$.

The idea is that we expand $\vec{x}$ as $\vec{x}=\sum_{i=1}^m \chi_i \vec{q}_i + \sum_{i=m+1}^n \chi_i \vec{q}_i$. Now since the eigenvectors are orthogonal, we can take the dot product of both sides of the equation with $\vec{q}_j$, say, and immediately recover that $\chi_j = \vec{q}_j\cdot \vec{b}/ \lambda_j$. So we define

$$\vec{x}_\mathrm{guess} = \sum_{i=1}^m \frac{\vec{q}_i\cdot \vec{b}}{\lambda_i} \vec{q}_i,$$ and use it as the first guess to any iterative solver.
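For concreteness, here is a minimal sketch of this warm start in SciPy (the test matrix and sizes below are made up for illustration; in practice the eigenpairs would already be in hand, e.g. from `scipy.sparse.linalg.eigsh`):

```python
import numpy as np
from scipy.sparse import diags, random as sprandom
from scipy.sparse.linalg import eigsh, cg

# Hypothetical sparse SPD test matrix (illustrative only)
rng = np.random.default_rng(0)
n, m = 200, 20
M = sprandom(n, n, density=0.05, random_state=0)
A = (M @ M.T + diags(np.full(n, 0.1 * n))).tocsc()  # SPD by construction
b = rng.standard_normal(n)

# m eigenpairs with the largest eigenvalues ('LM' = largest magnitude)
lam, Q = eigsh(A, k=m, which='LM')

# x_guess = sum_i (q_i . b / lambda_i) q_i, vectorized
x_guess = Q @ ((Q.T @ b) / lam)

# Use the guess as the starting vector for conjugate gradients;
# info == 0 signals convergence
x, info = cg(A, b, x0=x_guess)
```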

I would be shocked if this isn't a standard idea, but I don't know what it is called. What is its name? I would especially like to know what papers to cite.

fred

2 Answers

6

This is something like a truncated SVD or eigenvector expansion of your solution. If you take

$$x_m = \sum_{j=1}^m \frac{q_j\cdot b}{\lambda_j}q_j$$

with $m=n$, this is the exact solution to $Ax=b$. If you instead keep only the $m<n$ largest eigenvalues in the sum, you get an approximation. The problem is that it's a pretty poor approximation; since your eigenvalues are sorted from largest to smallest, the omitted coefficients $1/\lambda_j$ for $j>m$ are the largest in the series, with $1/\lambda_n$ the largest of all, so your error is

$$\|x-x_m\| = O(1/\lambda_{n})$$

Intuitively, small eigenvalues correspond to low-frequency components of the system, which tend to dominate the solution. Approximating your solution using the higher-frequency modes (larger-magnitude eigenvalues) gives a good approximation of a matrix-vector product, but not of the solution of a linear system.
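A quick numerical illustration of this point (a sketch using NumPy; the diagonal test matrix is made up so that the eigenpairs are known exactly, with eigenvectors being the standard basis vectors):

```python
import numpy as np

# Diagonal SPD matrix: eigenvalues known exactly, sorted largest to smallest
n, m = 100, 20
lam = np.linspace(100.0, 1.0, n)
b = np.ones(n)

x_exact = b / lam                 # exact solution of A x = b for diagonal A

# Truncated expansion keeping only the m largest eigenvalues
x_m = np.zeros(n)
x_m[:m] = b[:m] / lam[:m]

err = np.linalg.norm(x_exact - x_m)
# err is on the order of 1/lam[-1], the omitted 1/lambda_n term
```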

Jesse Chan
  • I think I agree with everything you wrote. I appreciate your insights as to why this may not be a standard technique. But, respectfully, this doesn't really answer my question. Is that what this technique is called, a "truncated eigenvector expansion"? Any pointers to the literature? – fred Oct 23 '15 at 15:02
  • You come across fine, don't worry :). For literature, it's pretty standard in most numerical linear algebra books - Trefethen and Bau for example. It's just a dyadic decomposition of an SVD, and for SPD matrices, this is just the eigenvalue expansion. I'm not sure there's a real name for it. – Jesse Chan Oct 23 '15 at 15:35
  • A side note: I suspect that your observation about the fact that "low frequency" components in the solution should be "dominant" is correct if you suppose that $Ax=b$ is the discretisation of some well behaved problem (e.g. a PDE) in which the unit matrix is a good proxy for a mass matrix. In that case, in fact, higher order eigenvalues are just numerical artifacts, and should NOT be used for approximating a solution. Generally speaking, without knowledge of the properties of $Ax=b$, it is hard to draw conclusions on the effectiveness of the proposed approach. – Stefano M Oct 27 '15 at 15:09
6

I don't know for sure whether there is an existing name for this method, but @jessechan's suggestion of "truncated eigenvector expansion" sounds perfectly fine to me (and most people would understand it).

Your question had a second part embedded, namely: why isn't this what everyone does to get a good initial guess for iterative solvers? The answer is that in all interesting cases, computing eigenvectors and eigenvalues is an extremely expensive proposition, far more expensive than doing a few more iterations (which are typically rather cheap). As a consequence, within our community of computational scientists, we most frequently don't try very hard to get good initial guesses for iterative solvers for linear systems. If you happen to have a good initial guess (e.g., the solution from the previous time step, or an extrapolation of the previous two time steps), then use it; if you have to work to get an initial guess, forget it -- it's going to be too expensive.

Wolfgang Bangerth