12

Consider a symmetric positive definite tridiagonal linear system $$A x = b$$ where $A \in \mathbb{R}^{n \times n}$ and $b \in \mathbb{R}^n$. Given three indices $0 \le i < j < k < n$, if we assume only the equations in rows strictly between $i$ and $k$ hold, we can eliminate the intermediate variables to get an equation of the form $$u x_i + v x_j + w x_k = c$$ where $v > 0$. This equation relates the value of $x_j$ to $x_i$ and $x_k$ independently of any "outside" influence (say, if a constraint affecting $x_0$ were introduced).
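
For concreteness, here is a brute-force reference sketch (my own illustrative code, not part of the original question; `diag` and `off` are assumed to hold the diagonal and off-diagonal of $A$) that computes the linking equation for one triple by dense Gaussian elimination on the rows strictly between $i$ and $k$. It only pins down what "the linking equation" means; at roughly $O((k-i)^3)$ per triple it is nowhere near the desired query cost.

```python
import numpy as np

def linking_equation(diag, off, rhs, i, j, k):
    """Brute-force reference for one triple i < j < k: assume only the equations
    in rows strictly between i and k, and eliminate every variable between i and
    k other than x_j.  Returns (u, v, w, c) with u*x_i + v*x_j + w*x_k = c.
    diag[m] = A[m,m], off[m] = A[m,m+1] = A[m+1,m]."""
    rows = list(range(i + 1, k))               # equations assumed to hold
    cols = list(range(i, k + 1))               # variables those equations touch
    M = np.zeros((len(rows), len(cols) + 1))   # augmented [A_sub | rhs_sub]
    for rl, r in enumerate(rows):
        M[rl, r - i] = diag[r]
        M[rl, r - 1 - i] = off[r - 1]
        M[rl, r + 1 - i] = off[r]
        M[rl, -1] = rhs[r]
    keep = {0, j - i, k - i}                   # columns for x_i, x_j, x_k
    row = 0
    for c in range(len(cols)):                 # eliminate every other column
        if c in keep:
            continue
        p = row + int(np.argmax(np.abs(M[row:, c])))   # partial pivoting
        M[[row, p]] = M[[p, row]]
        M[row] /= M[row, c]
        for rl in range(len(rows)):
            if rl != row:
                M[rl] -= M[rl, c] * M[row]
        row += 1
    # Exactly one un-pivoted equation remains, involving only x_i, x_j, x_k.
    return M[row, 0], M[row, j - i], M[row, k - i], M[row, -1]
```

For example, with diagonal 2, off-diagonals $-1$, $b = 0$, and $(i,j,k) = (0,2,3)$, this returns an equation equivalent to $x_2 = (x_0 + 2 x_3)/3$.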

Question: Is it possible to preprocess the linear system $Ax = b$ in $O(n)$ time so that the linking equation for any $(i,j,k)$ can be determined in $O(1)$ time?

If the diagonal entries of $A$ are 2, the off-diagonal entries are $-1$, and $b = 0$, the desired result is the analytic solution of the discretized Poisson equation. Unfortunately, it is not possible to transform a general SPD tridiagonal system into a constant-coefficient Poisson equation without breaking the tridiagonal structure, essentially because different variables can have different amounts of "screening" (local strict positive definiteness). A simple diagonal scaling of $x$, for example, can eliminate half of the $2n-1$ degrees of freedom of $A$ but not the other half.
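
(For reference, in that constant-coefficient case with $b = 0$ the interior equations $-x_{m-1} + 2 x_m - x_{m+1} = 0$ force $x_m$ to be affine in $m$, so the linking equation for any triple is just linear interpolation,

$$x_j = \frac{(k-j)\,x_i + (j-i)\,x_k}{k-i},$$

which is exactly the kind of $O(1)$-per-triple lookup I am after in general.)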

Intuitively, a solution to this problem would require arranging the problem so that the amount of screening could be accumulated into a linear size array and then somehow "cancelled" to arrive at the linking equation for the given triple.

Update (more intuition): In terms of PDEs, I have a discretized linear elliptic problem in 1D, and I want to know whether I can spend $O(n)$ in precomputation to produce some sort of "analytic" solution that can be looked up in $O(1)$ time, where I am allowed to vary where the boundary conditions are.

Geoffrey Irving

3 Answers

2

Here's a somewhat unstable solution that only works when the coupling between variables is always nondegenerate. Assume for simplicity that $b = 0$. First, precompute the $n$ linking equations for $(0,i,n-1)$ for $0 \le i < n$, say

$$x_i = a_i x_0 + b_i x_{n-1}$$

Now, given $i < j$, we can combine the $i$th and $j$th linking equations and eliminate $x_{n-1}$ to get

$$\begin{aligned} b_j x_i &= a_i b_j x_0 + b_i b_j x_{n-1} \\ b_i x_j &= a_j b_i x_0 + b_i b_j x_{n-1} \\ b_j x_i - b_i x_j &= (a_i b_j - a_j b_i) x_0 \\ x_i &= \frac{a_i b_j - a_j b_i}{b_j} x_0 + \frac{b_i}{b_j} x_j \end{aligned}$$

This process can be repeated once more to eliminate $x_0$ given $(i,j,k)$. Unfortunately, we lose stability near $b_j = 0$, or more generally when the tridiagonal system nearly decouples into independent blocks. If $b_j = 0$ exactly this is no problem, but I'm worried about breakdown for tiny but positive values.
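
Here is a minimal sketch of this scheme (my own illustrative code with hypothetical helper names, assuming $b = 0$, `diag[m]` $= A_{mm}$ and `off[m]` $= A_{m,m+1}$): the $O(n)$ precompute obtains the coefficients $(a_i, b_i)$ from two tridiagonal solves of the interior equations, and each query then eliminates $x_{n-1}$ and $x_0$ in $O(1)$ exactly as above.

```python
import numpy as np

def precompute_links(diag, off):
    """O(n) precompute of (a_i, b_i) with x_i = a_i*x_0 + b_i*x_{n-1} for the
    homogeneous interior equations (the b = 0 case above)."""
    n = len(diag)
    a = np.zeros(n)
    b = np.zeros(n)
    a[0] = 1.0
    b[n - 1] = 1.0
    if n > 2:
        # Interior rows 1..n-2 of A x = 0, with x_0 and x_{n-1} moved to the RHS.
        def interior_solve(rhs):
            # Thomas algorithm on the (n-2) x (n-2) symmetric interior block.
            d = diag[1:-1].astype(float)
            r = rhs.astype(float)
            lo = off[1:-1]                  # couples interior unknowns m and m+1
            for m in range(1, n - 2):
                w = lo[m - 1] / d[m - 1]
                d[m] -= w * lo[m - 1]
                r[m] -= w * r[m - 1]
            x = np.zeros(n - 2)
            x[-1] = r[-1] / d[-1]
            for m in range(n - 4, -1, -1):
                x[m] = (r[m] - lo[m] * x[m + 1]) / d[m]
            return x

        e0 = np.zeros(n - 2); e0[0] = -off[0]     # coupling to x_0 = 1
        e1 = np.zeros(n - 2); e1[-1] = -off[-1]   # coupling to x_{n-1} = 1
        a[1:-1] = interior_solve(e0)
        b[1:-1] = interior_solve(e1)
    return a, b

def link(a, b, i, j, k):
    """O(1) linking equation x_j = s*x_i + t*x_k for i < j < k: eliminate
    x_{n-1} between equations (i,j) and (j,k), then eliminate x_0.
    Unstable when the denominator is tiny (nearly decoupled blocks)."""
    Dij = a[i] * b[j] - a[j] * b[i]
    Djk = a[j] * b[k] - a[k] * b[j]
    denom = Dij * b[k] + Djk * b[i]
    return b[j] * Djk / denom, b[j] * Dij / denom
```

In the constant-coefficient case this reproduces linear interpolation, i.e. `link` returns $\big(\tfrac{k-j}{k-i}, \tfrac{j-i}{k-i}\big)$; the instability shows up when `denom` becomes tiny.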

Geoffrey Irving
  • Having implemented this, I can confirm that (1) it works in exact arithmetic and (2) it is extremely unstable. Intuitively, this solution does a bunch of extrapolation of exponential functions, which breaks the nice interpolatory character of elliptic problems. – Geoffrey Irving Aug 11 '13 at 01:00
  • It seems like your approach is to precompute something like the Green's function for all inner indices. It is then no surprise that you will have trouble when $b_j\approx 0$, since information about the boundary values can hardly propagate to the point of interest. I don't think there would be a general way around this. It seems that you might be better off making a tree structure (perhaps this is $n\log n$ precomputing effort) which allows you to get the Green's function for subsections of the domain to bypass potential trouble spots. – Victor Liu Aug 15 '13 at 21:24
  • The tree version is $O(n)$ precompute plus $O(\log n)$ per triple. Unfortunately I'm specifically searching for linear time solutions. – Geoffrey Irving Aug 16 '13 at 00:58
2

I wonder if you could do something useful with a cyclic-reduction factorization of $A$ (which I believe is still $O(n)$ in size), reusing as many as possible of the blocks that would remain unchanged when factoring a contiguous principal submatrix of $A$. I doubt it gives you $O(1)$, but maybe $O(\log n)$...
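
For reference, here is a minimal sketch of plain odd-even cyclic reduction for a tridiagonal solve (my own illustrative code; it only shows the $O(n)$-size, $O(\log n)$-depth elimination tree being alluded to, not the submatrix-reuse scheme speculated about above):

```python
import numpy as np

def cyclic_reduction_solve(sub, diag, sup, f):
    """Plain odd-even cyclic reduction for a tridiagonal solve.
    sub[i] = A[i,i-1] (sub[0] unused), diag[i] = A[i,i], sup[i] = A[i,i+1]
    (sup[-1] unused).  Each level folds the odd rows into their even
    neighbours, halving the system: O(n) total work, O(log n) depth."""
    sub, diag, sup, f = (np.asarray(v, dtype=float) for v in (sub, diag, sup, f))
    n = len(diag)
    if n == 1:
        return np.array([f[0] / diag[0]])
    even = np.arange(0, n, 2)
    m = len(even)
    rsub, rdiag, rsup, rf = np.zeros(m), diag[even].copy(), np.zeros(m), f[even].copy()
    for e, i in enumerate(even):
        if i - 1 >= 0:                       # fold row i-1 into row i
            w = sub[i] / diag[i - 1]
            rdiag[e] -= w * sup[i - 1]
            rf[e] -= w * f[i - 1]
            if i - 2 >= 0:
                rsub[e] = -w * sub[i - 1]
        if i + 1 < n:                        # fold row i+1 into row i
            w = sup[i] / diag[i + 1]
            rdiag[e] -= w * sub[i + 1]
            rf[e] -= w * f[i + 1]
            if i + 2 < n:
                rsup[e] = -w * sup[i + 1]
    x = np.zeros(n)
    x[even] = cyclic_reduction_solve(rsub, rdiag, rsup, rf)
    for i in range(1, n, 2):                 # back-substitute the odd unknowns
        xi = f[i] - sub[i] * x[i - 1]
        if i + 1 < n:
            xi -= sup[i] * x[i + 1]
        x[i] = xi / diag[i]
    return x
```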

  • Yeah, the $O(\log n)$ solution is immediate, but sadly breaks the desired paper title ("A linear time direct solver for convex tridiagonal quadratic programs with bound constraints"). – Geoffrey Irving Aug 15 '13 at 18:21
  • No chance of amortization helping you out? – Robert Bridson Aug 15 '13 at 18:30
  • There's a lot of other amortization going on, so it's quite possible. I don't know how yet, though. – Geoffrey Irving Aug 15 '13 at 18:35
  • This is what I would need to amortize away the cost: http://cstheory.stackexchange.com/questions/18655/maintaining-the-product-of-a-queue-of-semigroup-elements. – Geoffrey Irving Aug 16 '13 at 01:28
  • Great! Someone posted a wonderful solution to that cstheory question, so I shouldn't need the answer to this question anymore. The semigroup multiplication operation in that question is eliminating an intermediate variable. – Geoffrey Irving Aug 16 '13 at 06:52
1

Here's another attempt, which is more stable than the cancellation method but still not very good.

If $A$ is an SPD tridiagonal matrix, Meurant [1] gives the following stable formula for the entries of $B = A^{-1}$

$$B_{ij} = b_{i+1}\cdots b_j \frac{d_{j+1}\cdots d_n}{\delta_i \cdots \delta_n}$$

where $i \le j$, the $b_i$ are the negated off-diagonal entries (so $A_{i,i-1} = -b_i$), and $d_i,\delta_i$ are derived from the $UL$ and $LU$ factorizations of $A$. The linking formula for $i < j < k$ has the form

$$x_j = \left(\begin{array}{c}B_{ji} \\ B_{jk}\end{array}\right)^T \left(\begin{array}{cc}B_{ii} & B_{ik} \\ B_{ki} & B_{kk}\end{array}\right)^{-1} \left(\begin{array}{c}x_i \\ x_k\end{array}\right)$$

Unfortunately, this formula remains unstable. Intuitively, if $i$ and $k$ are reasonably close, a delta source at $i$ is similar to one at $k$, so the inverted $2 \times 2$ matrix is close to singular.
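
Here is a hedged sketch of this approach (my own illustrative code with hypothetical names, 0-based indices, assuming $b = 0$ and that the off-diagonal entries of $A$ are strictly negative, with their absolute values stored in `boff`): the two pivot recurrences plus cumulative log-products form the $O(n)$ precompute, each entry $B_{ij}$ is then an $O(1)$ lookup, and a triple query solves the $2 \times 2$ system above.

```python
import numpy as np

def meurant_precompute(diag, boff):
    """O(n) precompute for Meurant's formula [1].  diag[i] = A[i,i];
    boff[i] = -A[i,i+1] > 0 (the negated off-diagonals).  Returns cumulative
    log-products so that any entry of B = A^{-1} is an O(1) lookup."""
    n = len(diag)
    delta = np.empty(n)          # pivots of the LU (top-down) factorization
    d = np.empty(n)              # pivots of the UL (bottom-up) factorization
    delta[0] = diag[0]
    for i in range(1, n):
        delta[i] = diag[i] - boff[i - 1] ** 2 / delta[i - 1]
    d[n - 1] = diag[n - 1]
    for i in range(n - 2, -1, -1):
        d[i] = diag[i] - boff[i] ** 2 / d[i + 1]
    # Cumulative logs: products over any contiguous range become differences.
    logb = np.concatenate(([0.0], np.cumsum(np.log(boff))))
    logd = np.concatenate(([0.0], np.cumsum(np.log(d))))
    logdelta = np.concatenate(([0.0], np.cumsum(np.log(delta))))
    return logb, logd, logdelta

def Binv(pre, i, j):
    """(A^{-1})_{ij} for 0-based i <= j via Meurant's formula (1-based in [1])."""
    logb, logd, logdelta = pre
    n = len(logd) - 1
    return np.exp((logb[j] - logb[i]) + (logd[n] - logd[j + 1])
                  - (logdelta[n] - logdelta[i]))

def link_meurant(pre, i, j, k):
    """x_j = s*x_i + t*x_k for i < j < k (b = 0), via the 2x2 formula above.
    Still ill-conditioned when the 2x2 block is nearly singular."""
    B = lambda p, q: Binv(pre, min(p, q), max(p, q))
    M = np.array([[B(i, i), B(i, k)],
                  [B(i, k), B(k, k)]])
    s, t = np.linalg.solve(M, np.array([B(i, j), B(j, k)]))
    return s, t
```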

[1]: Gerard Meurant (1992), "A review on the inverse of symmetric tridiagonal and block tridiagonal matrices", SIAM Journal on Matrix Analysis and Applications, 13(3):707–728.

Geoffrey Irving