13

The title is the question. This technique involves using the "matrix of cofactors", or "adjugate matrix", and gives explicit formulae for the components of the inverse of a square matrix. It is not easy to do by hand for a matrix bigger than, say, $3\times 3$. For an $n\times n$ matrix, it requires computing the determinant of the matrix itself and computing $n^2$ determinants of $(n-1)\times(n-1)$ matrices. So I'm guessing it is not that useful for applications. But I would like confirmation.
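For concreteness, here is a minimal NumPy sketch of the technique (the function name is mine, and this is purely illustrative):

```python
import numpy as np

def adjugate_inverse(A):
    # Invert A via the adjugate: A^{-1} = adj(A) / det(A).
    # Requires det(A) plus n^2 determinants of (n-1)x(n-1) minors.
    n = A.shape[0]
    cof = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            # Minor: delete row i and column j, then take its determinant.
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            cof[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    # adj(A) is the transpose of the cofactor matrix.
    return cof.T / np.linalg.det(A)

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
print(np.allclose(adjugate_inverse(A), np.linalg.inv(A)))  # True
```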

I am not asking about the theoretical significance of the technique in proving theorems about matrices.

Nico Schlömer
Stefan Smith

2 Answers

12

You're right -- it has absolutely no practical relevance for computing. Even if computing a determinant were an $O(n)$ operation, the complexity of the method would still be at least $O(n^3)$, and consequently of the same complexity as Gaussian elimination. In practice, computing a determinant by the textbook recursive (Laplace) expansion has exponential complexity, making this method completely unusable.
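To see the gap concretely, here is a minimal sketch (function names are mine; illustration only) contrasting the textbook recursive expansion with a determinant computed via $LU$ factorization:

```python
import numpy as np
from scipy.linalg import lu

def det_laplace(A):
    # Recursive cofactor (Laplace) expansion along the first row: O(n!) work.
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    return sum((-1) ** j * A[0, j] * det_laplace(np.delete(A[1:], j, axis=1))
               for j in range(n))

def det_lu(A):
    # det(A) = det(P) det(L) det(U); L is unit triangular, so the
    # determinant is the permutation sign times the product of U's diagonal.
    P, L, U = lu(A)  # O(n^3) -- the cost of Gaussian elimination
    return np.linalg.det(P) * np.prod(np.diag(U))

A = np.random.rand(8, 8)
print(np.isclose(det_laplace(A), det_lu(A)))  # True, but only det_lu scales
```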

Wolfgang Bangerth
  • Two things I want to add: the complexity of Cramer's rule (using determinants to calculate an inverse) is $O(n!)$, which is much, much larger than Gaussian elimination's $O(n^3)$. Also, in general, you don't want to calculate an inverse unless you absolutely have to. – Paul May 27 '13 at 02:30
  • OTOH, there might be some circumstances where Laplace expansion is preferred, e.g., for banded matrices. But indeed, in general, Laplace expansion has $O(n!)$ complexity. – J. M. May 27 '13 at 02:30
  • @Stefan, yes, Gaussian elimination can be used to compute determinants. Since $\det(\mathbf A\mathbf B)=\det(\mathbf A)\det(\mathbf B)$, and Gaussian elimination produces (triangular) factors whose determinants are easily computed, it will indeed take $O(n^3)$ effort. – J. M. May 27 '13 at 03:55
  • Yes, you are correct -- the determinant can be computed at the cost of an $LU$ decomposition. (The naive way shown in textbooks using the recursive expansion is exponential in $n$ -- the $n!$ complexity mentioned by Paul.) But that still yields an overall complexity of $O(n^5)$ for the proposed algorithm -- far more than Gaussian elimination, if one were to use it, and even more than iterative solvers. – Wolfgang Bangerth May 27 '13 at 04:43
  • LU decomposition is exactly the reduction to triangular form. – Wolfgang Bangerth Sep 08 '13 at 05:52
  • @WolfgangBangerth : Thanks, but my point is that if you had to do it by hand, I'm guessing it takes much more bookkeeping to find $L$, $U$, and $P$ (three matrices) than merely to do the task I described in my comment above. In that task you lose information, but all you're trying to do is find the determinant. I'm guessing the computational complexity of the two tasks is the same, so the difference doesn't matter in practice, since you're probably using a computer and not working by hand. – Stefan Smith Sep 11 '13 at 22:57
  • 1
    Correct. Row reduction is one half of computing the $LU$ decomposition. It reduces $A$ to the $U$ factor. The other half of the work is doing the same operations starting from the identity matrix, yielding the $L$ matrix. It's true that you can avoid the latter if all you care about is the determinant. – Wolfgang Bangerth Sep 12 '13 at 05:43
  • @WolfgangBangerth There is no additional cost; $L$ comes basically for free in an LU decomposition. Its entries are $L_{ik}=A^{(k)}_{ik}/A^{(k)}_{kk}$, where $A^{(k)}$ is the matrix obtained after $k$ steps of Gaussian elimination to get the $U$ factor, and this is a division that one already has to make to compute $A^{(k+1)}$. So the only cost is writing it down. Zero flops. (See the sketch after this thread.) – Federico Poloni May 02 '15 at 08:50
  • @FedericoPoloni: You're splitting hairs. For me, Gaussian elimination = LU decomposition. If you consider these separate, then yes, you get the L factor for free as you do Gaussian elimination. But the overall cost is of course still that of Gaussian elimination. – Wolfgang Bangerth May 03 '15 at 18:45
  • @WolfgangBangerth: Out of sheer curiosity, the Laplace expansion for finding the determinant has the undeniable advantage of requiring less memory than Gaussian elimination / LU decomposition, right? The only memory cost of the Laplace expansion is bookkeeping the permutation, while LU has quadratic memory cost. Do you know how low we can go in terms of memory? – shuhalo Apr 05 '17 at 15:13
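To illustrate the point made in the last few comments, here is a minimal sketch of Gaussian elimination without pivoting (an assumption that holds only for suitable matrices; illustration only), showing that $L$ falls out for free:

```python
import numpy as np

def lu_no_pivot(A):
    # Gaussian elimination without pivoting; the multipliers computed to
    # eliminate each entry are exactly the entries of L, so recording
    # them costs zero extra flops.
    U = A.astype(float)  # astype makes a copy
    n = U.shape[0]
    L = np.eye(n)
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]      # the multiplier, stored for free
            U[i, k:] -= L[i, k] * U[k, k:]   # eliminate entry (i, k)
    return L, U

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
L, U = lu_no_pivot(A)
print(np.allclose(L @ U, A))                              # True
print(np.isclose(np.prod(np.diag(U)), np.linalg.det(A)))  # True
```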
9

I'm going against the crowd: the adjugate matrix is in fact very useful for some specialty applications with small dimensionality (say, $4\times 4$ or less), in particular when you need the inverse of a matrix but don't care about its scale.

Two examples are computing an inverse homography and Rayleigh quotient iteration for very small problems (where, in addition to the simplification, using the adjugate behaves better numerically).
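For the homography case, a minimal sketch (the function name is mine): since homogeneous coordinates are insensitive to overall scale, the adjugate itself serves as the inverse homography, with no division by the determinant.

```python
import numpy as np

def adjugate_3x3(H):
    # Rows of adj(H) are cross products of pairs of columns of H,
    # so that adj(H) @ H = det(H) * I. No division: this stays
    # well-behaved even when det(H) is tiny.
    c0, c1, c2 = H.T
    return np.array([np.cross(c1, c2),
                     np.cross(c2, c0),
                     np.cross(c0, c1)])

H = np.array([[1.0, 0.2, 3.0],
              [0.1, 1.0, 5.0],
              [0.0, 0.0, 1.0]])
print(np.allclose(adjugate_3x3(H) @ H, np.linalg.det(H) * np.eye(3)))  # True
```

Since a point and any nonzero multiple of it are the same point in homogeneous coordinates, adj(H) maps points exactly as $H^{-1}$ does.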

nonbasketless
  • I fully agree, there are some cases (in general, with small matrices) where it helps a lot! (For instance, for computing barycentric coordinates in a small simplex.) – BrunoLevy Nov 06 '15 at 20:43