
I hope this problem is not considered too "elementary" for MO. It concerns a formula that I have always found fascinating. For, at first glance, it appears completely "obvious", while on closer examination it does not even seem well-defined. The formula is the one that I was given as the definition of the cross-product in $\mathbb R^3 $ when I was first introduced to that concept:

$$ B \times C := \begin{vmatrix} {\mathbf i } & {\mathbf j } & {\mathbf k } \\ B_1 & B_2 & B_3 \\ C_1 & C_2 & C_3 \end{vmatrix} $$ On the one hand, if one expands this by minors of the first row, the result is clearly correct---and to this day this is the only way I can recall the formula for the components of the cross-product when I need it. But, on the other hand, the determinant of an $n \times n$ matrix whose elements are a mixture of scalars and vectors is undefined. Just think what happens if you interchange one element of the first row with the element just below it. In fact, as usually understood, for a determinant of a matrix to be well-defined, its elements should all belong to a commutative ring. But then again (on the third hand :-) if we take the dot product of both sides of the formula with a third vector, $A$, we seem to get:

$$ A \cdot B \times C = A \cdot \begin{vmatrix} {\mathbf i } & {\mathbf j } & {\mathbf k } \\ B_1 & B_2 & B_3 \\ C_1 & C_2 & C_3 \end{vmatrix} = \begin{vmatrix} A_1 & A_2 & A_3 \\ B_1 & B_2 & B_3 \\ C_1 & C_2 & C_3 \end{vmatrix} $$ and of course the left and right hand sides are well-known formulas for the (signed) volume of the parallelepiped spanned by the three vectors, $A, B, C$. Moreover, the validity of the latter formula for all choices of $A$ indicates that the original formula is "correct".
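Both identities are easy to check numerically; here is a quick NumPy sketch (my own, not part of the post) verifying that the first-row cofactor expansion reproduces the cross product and that dotting with $A$ gives the $3 \times 3$ determinant:

```python
# Sanity check (mine): expanding the symbolic determinant by minors of
# the first row gives the cross product, and A . (B x C) = det[A; B; C].
import numpy as np

rng = np.random.default_rng(0)
A, B, C = rng.standard_normal((3, 3))

# Expansion by minors of the first row (i, j, k):
cross = np.array([
    B[1] * C[2] - B[2] * C[1],     # coefficient of i
    -(B[0] * C[2] - B[2] * C[0]),  # coefficient of j (note the sign)
    B[0] * C[1] - B[1] * C[0],     # coefficient of k
])

print(np.allclose(cross, np.cross(B, C)))                         # True
print(np.isclose(A @ cross, np.linalg.det(np.array([A, B, C]))))  # True
```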

So, my question is this: Is there a rigorous way of defining the original determinant so that all of the above becomes meaningful and correct?

Dick Palais
  • There's a natural way to side-step your question in that the cross product is dual (Hodge dual + vector space isomorphic to its dual via an inner product) to the wedge product of forms. And your formula is essentially an expression of that duality. In the same way you can define the "cross product" of $n-1$ vectors in $\mathbb R^n$, etc. – Ryan Budney Feb 02 '11 at 18:46
  • A silly way out is to view your vector space as a bimodule over the field, and then you can compute the determinant without any guilt :) – Mariano Suárez-Álvarez Feb 02 '11 at 18:52
  • Would you be happy with a nice formulaic interpretation of a "Hodge star" isomorphism $\mathbb R^3 \wedge \mathbb R^3 \to \mathbb R^3$ ? Or do you really want determinants defined at some enhanced level of generality? – Ryan Budney Feb 02 '11 at 18:57
  • http://en.wikipedia.org/wiki/Geometric_algebra – Steve Huntsman Feb 02 '11 at 19:04
  • @Ryan: Well, I think I understand the Hodge star isomorphism reasonably well (at least, I have written a number of articles purporting :-) to explain it to others in various places, e.g., Seminar on the Atiyah-Singer Index Theorem), and yes, I was hoping for an acceptable definition of det that would make everything kosher, but I don't want to dissuade you from answering via Hodge theory. – Dick Palais Feb 02 '11 at 19:05
  • @Steve Huntsman: Thanks, Steve, that certainly looks promising. I had heard of GA before, but it always seemed too formal for my taste and I never looked into it carefully---perhaps this is a good opportunity to look at it more carefully. But, I didn't see any mention in the Wikipedia article about a relation between GA and determinants, although there clearly must be one. Do you know where I can look for that? – Dick Palais Feb 02 '11 at 19:21
  • It's tricky answering your questions because I figure you've seen everything already. But I don't like making those kinds of assumptions about people, so here we are. :) – Ryan Budney Feb 02 '11 at 19:26
  • @Ryan Budney: "...It's tricky answering your questions because I figure you've seen everything already." On the contrary, it is absolutely amazing to me (and more than a little humbling) how much I have learned from answers and comments to the questions that I and others have asked here. – Dick Palais Feb 02 '11 at 19:33
  • @Dick: GA is really just another name for Clifford algebras, and there are determinants everywhere if you do coordinate calculations in a Clifford algebra. For example, the highest graded part of the Clifford product of two (homogeneous) multivectors is the exterior product of those multivectors, and exterior product is related to determinants in a way that you're probably familiar with. – Hans Lundmark Feb 03 '11 at 10:57
  • ...and by the way, I wish I could refer you to the book that a colleague of mine is writing, but unfortunately it is not finished yet: http://www.mai.liu.se/~anaxe/GMA.html – Hans Lundmark Feb 03 '11 at 11:00
  • @Hans: You said: "GA is really just another name for Clifford algebras." Yes, so I eventually figured out. When I finally started reading about GA, it quickly looked familiar, and finally Clifford algebras got mentioned and I realized why. Why this renaming of a standard, well-known, and well-studied structure? They were a popular topic of study back in the 60s because of their use in the Index Theorem, and I even recall writing a section explaining them in the IAS Seminar on the Atiyah-Singer Index Theorem volume. The TOC of your colleague's book looks great! – Dick Palais Feb 04 '11 at 05:03
  • @Dick: I think the new name was introduced for marketing reasons, and is mainly used by the followers of David Hestenes. – Hans Lundmark Feb 04 '11 at 08:50
  • @HansLundmark The link in your comment is broken. I believe that book you're referring to is now published: Geometric Multivector Analysis: From Grassmann to Dirac, by Andreas Rosén. – Timothy Chow Mar 15 '24 at 18:12
  • @TimothyChow: Yes, that's the one! Andreas moved to another university long ago, so the link I gave probably hasn't worked for the last 10 years at least... (And the book is great, by the way!) – Hans Lundmark Mar 15 '24 at 22:33

5 Answers


But there is a commutative ring available, along the lines of what Mariano says. If $k$ is a field and $V$ is a vector space, then $k \oplus V$ is a commutative ring by the rule that a scalar times a scalar, or a scalar times a vector, or a vector times a scalar, are all what you think they are. The only missing part is a vector times a vector, and you can just set that to zero. The dot product is then a special bilinear form on the algebra. In this formalism, I think that everything that you wrote makes sense.
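This ring is small enough to implement directly. Below is a minimal sketch (the class and function names are mine): elements are pairs $(a, v)$ with $(a,v)(b,w) = (ab,\, aw + bv)$, the vector-times-vector term declared zero, and the determinant computed by the Leibniz formula over this ring. Its vector part comes out to $B \times C$:

```python
# A sketch of the commutative ring k (+) V and the determinant over it.
import numpy as np
from itertools import permutations

class RingElt:
    """Element (a, v) of k (+) V, with the product of two vectors set to 0."""
    def __init__(self, a, v=None):
        self.a = float(a)
        self.v = np.zeros(3) if v is None else np.asarray(v, dtype=float)
    def __add__(self, other):
        return RingElt(self.a + other.a, self.v + other.v)
    def __mul__(self, other):
        # (a, v)(b, w) = (ab, aw + bv); the vw term is declared zero.
        return RingElt(self.a * other.a, self.a * other.v + other.a * self.v)

def perm_sign(p):
    """Sign of a permutation given as a tuple of indices."""
    s = 1
    for i in range(len(p)):
        for j in range(i + 1, len(p)):
            if p[i] > p[j]:
                s = -s
    return s

def ring_det(M):
    """Leibniz-formula determinant over the commutative ring k (+) V."""
    total = RingElt(0)
    for p in permutations(range(len(M))):
        term = RingElt(perm_sign(p))
        for i, pi in enumerate(p):
            term = term * M[i][pi]
        total = total + term
    return total

B = np.array([1.0, 2.0, 3.0])
C = np.array([4.0, 5.0, 6.0])
top = [RingElt(0, e) for e in np.eye(3)]   # the symbolic row (i, j, k)
M = [top, [RingElt(b) for b in B], [RingElt(c) for c in C]]
result = ring_det(M)
# result.v equals B x C; result.a is 0 since each Leibniz term
# contains exactly one vector factor.
```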


Theo says in a comment that "even better", one should work over $\Lambda^*(V)$, the exterior algebra over $V$. The motivation is that this algebra is supercommutative. I considered mentioning this solution, and I suppose that I really should have, because it arises in important formulas. For example, the Gauss formula for the linking number between two knots $K_1, K_2 \subseteq \mathbb{R}^3$ is: $$\mathrm{lk}(K_1,K_2) = \int_{K_1 \times K_2} \frac{\det \begin{bmatrix} \vec{x} - \vec{y} \\ d\vec{x} \\ d\vec{y} \end{bmatrix}}{4\pi |\vec{x} - \vec{y}|^3}$$ $$= \int_{K_1 \times K_2} \frac{\det \begin{bmatrix} x_1 - y_1 & x_2 - y_2 & x_3 - y_3 \\ dx_1 & dx_2 & dx_3 \\ dy_1 & dy_2 & dy_3 \end{bmatrix}}{4\pi |\vec{x} - \vec{y}|^3}.$$ The right way to write and interpret this formula is indeed as a determinant in the exterior algebra of differential forms. For one reason, it makes it easy to generalize Gauss' formula to higher dimensions.
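As a concrete illustration (my own discretization, with parameter choices that are mine, not from the answer), the Gauss integral can be evaluated numerically for two circles forming a Hopf link; the answer should be $\pm 1$:

```python
# Numerically evaluating the Gauss linking integral for a Hopf link.
import numpy as np

N = 400
t = np.linspace(0, 2 * np.pi, N, endpoint=False)
dt = 2 * np.pi / N

# K1: unit circle in the xy-plane; K2: unit circle in the xz-plane
# centered at (1, 0, 0), so the two curves are linked exactly once.
x = np.stack([np.cos(t), np.sin(t), np.zeros(N)], axis=1)
y = np.stack([1 + np.cos(t), np.zeros(N), np.sin(t)], axis=1)
dxdt = np.stack([-np.sin(t), np.cos(t), np.zeros(N)], axis=1)
dydt = np.stack([-np.sin(t), np.zeros(N), np.cos(t)], axis=1)

diff = x[:, None, :] - y[None, :, :]        # x - y over K1 x K2
r3 = np.linalg.norm(diff, axis=2) ** 3
# det[x - y; dx; dy] = (x - y) . (dx cross dy), pointwise:
triple = np.einsum('ijk,ijk->ij',
                   diff, np.cross(dxdt[:, None, :], dydt[None, :, :]))
lk = (triple / r3).sum() * dt * dt / (4 * np.pi)
# |lk| should be very close to 1 (the curves are smooth and well
# separated, so the periodic rectangle rule converges rapidly).
```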

However, supercommutative is not the same as commutative, and this type of determinant has fewer properties than a determinant over a commutative ring, and different ones. Such a determinant has a broken symmetry: you get a different answer if you order the factors in each term by rows than by columns. (I am using row ordering.) Indeed, the row-ordered determinant can be non-zero even if it has repeated rows. To give two examples, the determinant in the generalized Gauss formula has repeated rows, and the standard volume form in $\mathbb{R}^n$ is $$\omega = \frac{\det ( d\vec{x}, d\vec{x}, \ldots, d\vec{x} )}{n!}.$$
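The volume-form identity can be checked mechanically with a tiny Grassmann-algebra implementation (my own encoding, for illustration only: a form is a dict mapping a frozenset of basis indices to a coefficient). The row-ordered Leibniz determinant with three identical rows $(dx_1, dx_2, dx_3)$ is indeed non-zero:

```python
# A minimal Grassmann algebra and the row-ordered determinant over it.
from itertools import permutations

def wedge(u, v):
    """Wedge product of elements written as {frozenset of indices: coeff}."""
    out = {}
    for s, a in u.items():
        for t, b in v.items():
            if s & t:
                continue                  # dx_i ^ dx_i = 0
            # sign from commuting generators into ascending order
            sgn = (-1) ** sum(1 for i in s for j in t if i > j)
            key = s | t
            out[key] = out.get(key, 0) + sgn * a * b
    return {k: c for k, c in out.items() if c}

def perm_sign(p):
    return (-1) ** sum(1 for i in range(len(p))
                       for j in range(i + 1, len(p)) if p[i] > p[j])

def row_det(M):
    """Row-ordered Leibniz determinant: factors taken top row first."""
    out = {}
    for p in permutations(range(len(M))):
        term = {frozenset(): perm_sign(p)}
        for i, pi in enumerate(p):
            term = wedge(term, M[i][pi])
        for k, c in term.items():
            out[k] = out.get(k, 0) + c
    return {k: c for k, c in out.items() if c}

dx = [{frozenset({i}): 1} for i in range(3)]   # the 1-forms dx1, dx2, dx3
D = row_det([dx, dx, dx])                      # three identical rows!
# D == {frozenset({0, 1, 2}): 6}, i.e. 3! dx1^dx2^dx3, so omega = D / 3!.
```

Each permutation $p$ contributes $\mathrm{sign}(p)\, dx_{p(1)} \wedge dx_{p(2)} \wedge dx_{p(3)} = \mathrm{sign}(p)^2\, dx_1 \wedge dx_2 \wedge dx_3$, so the $n!$ terms reinforce rather than cancel.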

Happily, for Dick's question, you can truncate the exterior algebra at degree 1, which is exactly what I did. This truncation is both supercommutative and commutative.

  • If you define the product of a vector with a vector to be zero and try the same dot-product trick with the second row, you would get a wrong answer. – Sergei Ivanov Feb 02 '11 at 21:13
  • @Sergei: ??? Sorry to be so dense, but could you expand on your comment a bit. I'm not sure what you mean by "the same dot product trick with the second row". If you mean take the dot product of the original formula with $B$ then you get $0 = 0$ so you obviously must mean something else. – Dick Palais Feb 02 '11 at 21:53
  • Even better, one should work over $\Lambda^\bullet V = $ the free commutative (in the super sense) ring generated by $V$ in degree 1. – Theo Johnson-Freyd Feb 03 '11 at 00:11
  • Theo: I think the point of the question is to expand on "should"... – François G. Dorais Feb 03 '11 at 00:31
  • This is a beautiful construction and is the basis for a very powerful construction of Quillen called the stabilization, or the tangent category (by others). I think Quillen also called it the "abelianization". – Harry Gindi Feb 03 '11 at 04:34
  • @Harry, it is just the dual numbers. The idea goes back, more or less, to Lagrange and friends! – Mariano Suárez-Álvarez Feb 03 '11 at 06:55
  • @Mariano: Yes, but it's also got lots of extensions! That's what my comment was about. – Harry Gindi Feb 03 '11 at 07:17
  • Let me restate Greg's last sentence in plain English: If you demand that exactly one row of $|\cdots|$ consists of vectors, then $|\cdot|$ is well-defined on a standard vector space and nothing new is needed. If, however, you want to allow more than one row of vectors, then $|\cdot|$ is still well-defined but takes values in the exterior algebra. – Deane Yang Feb 03 '11 at 18:17

Back in the 19th century, when people had been experimenting with determinants a lot, they might have interpreted the above definition of $B\times C$ in terms of quaternions. If $i$, $j$, and $k$ denote basis elements of $\mathbb H$ and $${\mathbf x}=x_1i+x_2j+x_3k,$$ $${\mathbf y}=y_1i+y_2j+y_3k$$ are pure imaginary elements of $\mathbb H$, then the vector part $\Im(\mathbf{xy})$ of the Hamilton product $\mathbf{xy}$ is equal to the determinant

$$\Im(\mathbf{xy})=\Im(\mathbf{x})\times \Im(\mathbf{y})=\begin{vmatrix} i & j & k \\ x_1 & x_2 & x_3 \\ y_1 & y_2 & y_3 \end{vmatrix}.$$
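This is easy to verify: a quick check (my implementation of the Hamilton product, written out from the $i, j, k$ multiplication table) shows that for pure imaginary $\mathbf x, \mathbf y$ the product is $\mathbf{xy} = -\langle \mathbf x, \mathbf y\rangle + \mathbf x \times \mathbf y$, so its vector part is exactly the cross product:

```python
# Checking that the vector part of the Hamilton product of two pure
# imaginary quaternions is their cross product.
import numpy as np

def qmul(p, q):
    """Hamilton product; quaternions stored as (w, x, y, z)."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

x = np.array([0., 1., 2., 3.])   # x1 i + x2 j + x3 k, pure imaginary
y = np.array([0., 4., 5., 6.])
xy = qmul(x, y)
# xy[1:] is the cross product of (1,2,3) and (4,5,6);
# xy[0] is minus their dot product.
```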

There is a note by Sir Arthur Cayley where he introduces the notion of a quaternion determinant. He mentions several identities of the form

$$ \begin{vmatrix} {\mathbf x} & {\mathbf x} \\ {\mathbf y} & {\mathbf y} \end{vmatrix} = -2\begin{vmatrix} i & j & k \\ x_1 & x_2 & x_3 \\ y_1 & y_2 & y_3 \end{vmatrix} $$ and $$ \begin{vmatrix} {\mathbf x } & {\mathbf x } & {\mathbf x } \\ {\mathbf y } & {\mathbf y } & {\mathbf y } \\ {\mathbf z } & {\mathbf z } & {\mathbf z } \end{vmatrix} = -2\begin{vmatrix} 3 & i & j & k \\ x_0 & x_1 & x_2 & x_3 \\ y_0 & y_1 & y_2 & y_3 \\ z_0 & z_1 & z_2 & z_3 \end{vmatrix} $$ where $\mathbf x$, $\mathbf y$, $\mathbf z$ are arbitrary quaternions $${\mathbf x}=x_0+x_1i+x_2j+x_3k, \mbox{ etc.}$$

Andrey Rekalo

I guess I'm not sure about the difference between "rigorous" and "formal". To me, $|\cdots|$ can be viewed as defining the exterior product of $n$ vectors in an $n$-dimensional vector space. So if you leave the first row blank, then what you have is a linear functional defined by taking the exterior product of $n-1$ vectors. If you fill in the blank row with the basis (that you're writing everything with respect to) $e_1, \dots, e_n$, then clearly taking the dot product of this with an arbitrary $n$th vector gives the exterior product of all $n$ vectors. It follows that this vector is just the Hodge star of the exterior product of the original $n-1$ vectors. But this is just a formal discussion, right?

Deane Yang

Of course, this is not really an answer. Merely a contribution. I like this one too: $$X(Y\times Z)^T+Y(Z\times X)^T+Z(X\times Y)^T=\det(X,Y,Z)\,I_3,\qquad\forall X,Y,Z\in k^3.$$
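This identity (essentially Cramer's rule in disguise: apply both sides to a vector $W$ and expand $W$ in the basis $X, Y, Z$) is easy to verify numerically; a quick NumPy check of my own:

```python
# Numerical check of X (YxZ)^T + Y (ZxX)^T + Z (XxY)^T = det(X,Y,Z) I_3.
import numpy as np

rng = np.random.default_rng(1)
X, Y, Z = rng.standard_normal((3, 3))

lhs = (np.outer(X, np.cross(Y, Z))
       + np.outer(Y, np.cross(Z, X))
       + np.outer(Z, np.cross(X, Y)))
rhs = np.linalg.det(np.column_stack([X, Y, Z])) * np.eye(3)
# lhs and rhs agree to floating-point precision.
```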

Denis Serre

Hi Dick, I don't know whether there is one way or another to give a formal meaning to this formula, but I understand it this way:

We have the identity $\det[X\ Y\ Z] = \langle X, Y \times Z\rangle$, where the brackets denote the scalar product. So $Y \times Z = \sum_{i=1}^3 \langle e_i, Y \times Z \rangle e_i = \sum_{i=1}^3 \det[e_i \ Y\ Z]e_i$, where the $e_i$ are the vectors of the canonical basis, which is your formula, I think.
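This expansion is also easy to verify directly; a short NumPy sketch of my own (vectors placed as columns of each determinant):

```python
# Checking Y x Z = sum_i det[e_i  Y  Z] e_i over the canonical basis.
import numpy as np

rng = np.random.default_rng(2)
Y, Z = rng.standard_normal((2, 3))
e = np.eye(3)

expansion = sum(np.linalg.det(np.column_stack([e[i], Y, Z])) * e[i]
                for i in range(3))
# expansion agrees with np.cross(Y, Z).
```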

Note: this seems to be the same remark as Deane Yang's.

Patrick I-Z