Has there been any work on finding the minimum number of elementary arithmetic operations needed to compute the determinant of an $n$ by $n$ matrix for small and fixed $n$? For example, $n=5$.
-
I've asked experts about this, and apparently it is not even currently known whether or not 9 multiplications are needed to compute the determinant of a 3x3 matrix. – Jeffrey Shallit Nov 21 '16 at 17:23
-
@JeffreyShallit If $9$ is possible that is already interesting (as it is less than $n^3$ for example). How about $n=4$? – Simd Nov 21 '16 at 19:36
-
No, not interesting at all. 9 multiplications for n=3 follows by expansion by minors. For n = 4 again, expansion by minors gives 40. I don't know how to do it in less than 40 multiplications. – Jeffrey Shallit Nov 21 '16 at 22:41
-
@JeffreyShallit Oh I see, good point. It's amazing (at least to me) if nothing better than naive is known for any fixed $n$. – Simd Nov 22 '16 at 08:41
-
If someone knows, perhaps they can tell us. – Jeffrey Shallit Nov 22 '16 at 15:32
-
@JeffreyShallit : Have you read any lower bounds on the number of multiplications needed for that? (I see how to show that just one multiplication isn't even enough for 2x2 matrices, so one multiplication also isn't enough for 3x3 matrices, but as far as I can tell, it could be that 2 multiplications are enough to compute the determinant of a 3x3 matrix.) – Nov 23 '16 at 15:04
-
No, I know of no decent lower bound. Dan Roche is working on it, though. – Jeffrey Shallit Nov 23 '16 at 23:27
1 Answer
It is known that the number of arithmetic operations necessary to compute the determinant of an $n\times n$ matrix is $n^{\omega+o(1)}$, where $\omega$ is the matrix multiplication constant. See for example this table on Wikipedia, as well as its footnotes and references. Note that the asymptotic complexity of matrix inversion is also the same as matrix multiplication in this same sense.
The equivalence is quite effective. In particular, you can recursively compute the determinant of an $n\times n$ matrix by working on $(n/2) \times (n/2)$ blocks using the Schur complement:
$$ \text{$D$ invertible}\qquad\implies\qquad\det \begin{pmatrix}A&B\\C&D\end{pmatrix} = \det(D) \det(A-BD^{-1}C). $$
Thus, you can compute an $n\times n$ determinant by computing two $(n/2)\times (n/2)$ determinants, inverting one $(n/2)\times (n/2)$ matrix, multiplying two pairs of $(n/2)\times (n/2)$ matrices, and performing some cheaper additions and subtractions. Expanding the recursive determinant calls, the complexity ends up dominated by the matrix multiplications and inversions.
This works well even for small $n$, and even without sub-cubic matrix multiplication algorithms. (Of course, it ends up being more or less equivalent to Gaussian elimination.) For example, for $n=4$, we can compute $\det(D)$ with two multiplications, $D^{-1}$ with four divisions (reusing the already-computed $\det(D)$), $BD^{-1}C$ with $2\times 8=16$ multiplications, $\det(A-BD^{-1}C)$ with two multiplications, and the final answer with one multiplication. The total is $2+4+16+2+1=25$ multiplications and divisions, which is less than the $40$ from cofactor expansion. Using Strassen's algorithm for the two $2\times 2$ block products saves two more multiplications here, and asymptotically it saves much more.
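A quick sanity check of the $n=4$ recipe above (a minimal sketch, not from the answer itself; the helper names `det2`, `inv2`, `mul2`, `sub2` are ad hoc), using exact rationals so the division introduces no rounding:

```python
from fractions import Fraction

def det2(M):
    # 2 multiplications
    return M[0][0]*M[1][1] - M[0][1]*M[1][0]

def inv2(M):
    d = det2(M)
    # 4 divisions (the adjugate needs no new multiplications)
    return [[ M[1][1]/d, -M[0][1]/d],
            [-M[1][0]/d,  M[0][0]/d]]

def mul2(X, Y):
    # 8 multiplications (naive 2x2 product; Strassen would use 7)
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def sub2(X, Y):
    return [[X[i][j] - Y[i][j] for j in range(2)] for i in range(2)]

def det4_schur(M):
    A = [row[:2] for row in M[:2]]
    B = [row[2:] for row in M[:2]]
    C = [row[:2] for row in M[2:]]
    D = [row[2:] for row in M[2:]]
    # det M = det(D) * det(A - B D^{-1} C), assuming D is invertible
    return det2(D) * det2(sub2(A, mul2(mul2(B, inv2(D)), C)))

M = [[Fraction(x) for x in row] for row in
     [[2, 1, 0, 3],
      [1, 4, 1, 0],
      [0, 2, 3, 1],
      [1, 0, 1, 2]]]
print(det4_schur(M))  # -10, matching cofactor expansion
```

Counting the multiplications and divisions in the calls above reproduces the $2+4+16+2+1=25$ tally. Note the formula requires the block $D$ to be invertible; a full implementation would pivot when it is not.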
You may notice that this expansion crucially uses division. If you wish to avoid division, this can be done in $O(n^4)$ ring operations by working with clow sequences and dynamic programming. It is also known how to achieve $n^{2+\omega/2+o(1)}$ multiplications with no divisions.
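For concreteness, here is a sketch of that $O(n^4)$ division-free dynamic program over clow sequences (in the style of Mahajan and Vinay; the function name and state layout are my own). A partial state $(c, v)$ records the head $c$ of the current clow (its minimum vertex) and the current vertex $v$, and closing a clow flips the sign:

```python
def det_clow(A):
    """Division-free determinant via clow sequences, O(n^4) ring
    operations; a sketch, not optimized."""
    n = len(A)
    # f[(c, v)]: signed sum of weights of partial clow sequences with
    # l edges used, current clow headed at c, currently at vertex v.
    f = {(c, c): 1 for c in range(n)}
    for l in range(1, n):
        g = {}
        for (c, v), w in f.items():
            # extend the current clow to any vertex u > its head c
            for u in range(c + 1, n):
                g[(c, u)] = g.get((c, u), 0) + w * A[v][u]
            # close the current clow (edge back to the head, sign flip)
            # and open a new clow at a strictly larger head c2
            for c2 in range(c + 1, n):
                g[(c2, c2)] = g.get((c2, c2), 0) - w * A[v][c]
        f = g
    # close the final clow; overall sign (-1)^n
    return (-1) ** n * sum(-w * A[v][c] for (c, v), w in f.items())
```

Since only ring operations (additions, multiplications, sign flips) appear, this works over any commutative ring, e.g. integers or polynomials, where the Schur-complement recursion above would be unavailable.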
-
Do you know any lower bound on just the number of multiplications? Even for n = 3? – Jeffrey Shallit Nov 30 '16 at 10:52
-
Your phrasing "the number of arithmetic operations necessary to compute the determinant of an $n \times n$ matrix is $n^{\omega +o(1)}$" suggests that a lower bound is known. But I did not see this in any of the cited works. Am I missing something? – Jeffrey Shallit Nov 30 '16 at 11:00
-
The lower bound is in the paper by W. Baur and V. Strassen, "The complexity of partial derivatives" (http://dx.doi.org/10.1016/0304-3975(83)90110-X). – Vladimir Lysikov Nov 30 '16 at 12:38