
There is a well known essay by Dray and Manogue which argues that differentials should be brought back into freshman calculus, and that we shouldn't worry too much about choosing a specific way of formalizing them, or about giving students a formal answer to the question of "what are differentials?"

I find myself in almost complete agreement with Dray and Manogue, and I feel especially that differentials are the natural way to approach related rates and implicit differentiation. Since differentials are used universally in the sciences and engineering, I think freshman calc students should be exposed to them. I would like to throw in just a taste of differentials when doing related rates and implicit differentiation, without going whole hog and using them throughout the course.
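
To make concrete the sort of "taste" I have in mind (this is my own illustrative example, not one taken from Dray and Manogue): for implicit differentiation one takes differentials of both sides and divides,
$$x^2 + y^2 = 25 \;\Longrightarrow\; 2x\,dx + 2y\,dy = 0 \;\Longrightarrow\; \frac{dy}{dx} = -\frac{x}{y},$$
and the related-rates version of the same problem just divides the middle equation by $dt$ instead, giving $2x\,\frac{dx}{dt} + 2y\,\frac{dy}{dt} = 0$.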

But I will admit to quite a bit of apprehension about actually going ahead with such an approach. I suspect that, for example, I would get some push-back from colleagues who believe that differentials are inherently inconsistent; that students would go to tutors for help, and that the tutors wouldn't understand the approach; and that the kind of casualness about foundational aspects advocated by Dray and Manogue would be interpreted as carelessness or incompetence on my part.

Has anyone implemented anything like what Dray and Manogue advocate, and if so, what were their practical experiences like?

Related: A calculus book that uses differentials? [I was surprised to find that Keisler's treatment of implicit differentiation looks exactly like a standard one, without differentials or infinitesimals.]

Mikhail Katz
  • I for one was thoroughly confused by the (informal) infinitesimals rampant in engineering... – vonbrand May 11 '14 at 00:49
  • Agreed, I always was confused by them too. – DumpsterDoofus May 11 '14 at 02:17
  • @vonbrand: What did you find confusing? Did you eventually overcome your confusion? Would different freshman calc preparation have helped you to avoid the confusion? –  May 11 '14 at 04:15
  • @BenCrowell, getting a rigorous first calculus course with no "approaches," "very small," and the like did help clean up some misconceptions from earlier courses and self-study. Later I was able to use those heuristically to set up needed equations (knowing I could do it right if needed, and keeping in mind that they are essentially linear terms of Taylor series). – vonbrand May 11 '14 at 10:11
  • Having a consistent approach (not the teacher talking one language, TAs and textbooks another, and other students learning a different way) is probably more important than anything else. I was lucky to have a teacher who managed to shield us from this by making the $\epsilon$-$\delta$ approach natural. – vonbrand May 11 '14 at 10:21
  • A good question. But how to define "differential" in an intellectually honest (i.e. rigorous) way? – Jyrki Lahtonen May 12 '14 at 06:58
  • This is all very interesting! I wrote my Bachelor's Thesis about Leibniz's development of differentials because my pre-university teacher had promised to tell us a little about what they meant, but he never did. BTW I am Danish. We also never defined limits with $\varepsilon,\delta$ before I got to university level. It was all "$\Delta x$ approaches this and that." Very disturbing, I think! – String May 12 '14 at 07:47
  • @JyrkiLahtonen: But how to define "differential" in an intellectually honest (i.e. rigorous) way? The essay by Dray and Manogue lists a number of approaches. One is to use NSA; this approach is developed in the book by Keisler http://www.math.wisc.edu/~keisler/calc.html , which is free online. –  May 12 '14 at 14:59
  • Thanks, @Ben. I will definitely take a look. – Jyrki Lahtonen May 12 '14 at 15:02
  • @BenCrowell thanks for the link to that paper. I very much agree with their sentiments concerning linearization: "This invaluable tool should be emphasized, not relegated to a recently-invented role in linear approximation, for which it is neither suited nor needed." – James S. Cook May 13 '14 at 11:24
  • While I'm all in favor of teaching calculus with differentials, I don't agree with everything Dray and Manogue say. In particular, it seems to me that in addition to their many other uses, differentials are precisely suited for linear approximation, and that describing their role in that area is a nice complement to their use in the chain rule, as mentioned in David Butler's answer. – Mike Shulman Jul 03 '14 at 05:17

6 Answers


A faculty member in my department is a former student of Dray's and 2-3 years ago showed me the differentials approach. He and I both use it quite extensively in our first semester calculus courses.

Here are the pros and cons (in no particular order), with my comments on them.

Pro:

1) It's generally awesome. Students in my class struggle a lot less with the chain rule, implicit differentiation, related rates, optimization, u-substitution, and integration by parts. (I could go on about how awesome it is, but that's not part of the original question.) In fact, the students who struggle the most are the ones who are retaking the course (either repeating it in college or having already taken it in high school) and won't embrace the differentials method. They usually end up making what I assume are the same mistakes they made the first time they took the course.

2) In Calculus II, it makes even more sense. Integration techniques make more sense because $dx$ or $du$ is not just marking the end of the integral; it actually means something. For the geometrical computations you can talk generally and then have students apply it to the given situation. For example, instead of two formulas for volumes by cylindrical shells, I give them one: $\int 2\pi r\,h\,dr$. Then it's up to them to figure out what $r$, $h$, and $dr$ represent geometrically. Likewise, it makes introducing parametric and polar curves easier, since students are already comfortable with equations that aren't $y=f(x)$. Also, it greatly opens up the applications-of-integration section. Work = (force)(distance) now becomes $dW = s \cdot dF$ (or $dW = F \cdot ds$, as appropriate: "What is the small quantity?", "Which factor are you assuming to be constant on your slice?"), which is really how advanced physics/engineering classes think about such relations.
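
To make the slicing idea concrete, here are two quick sketches of the kind of setup I have in mind (the specific functions and numbers are invented here purely for illustration). For shells, rotating the region under $y = x^2$, $0 \le x \le 1$, about the $y$-axis, the student identifies $r = x$, $h = x^2$, $dr = dx$ and writes
$$V = \int 2\pi r\,h\,dr = \int_0^1 2\pi x \cdot x^2\,dx = \frac{\pi}{2}.$$
For work, lifting a 20 m cable that weighs 2 N/m up the side of a building: the slice of cable of length $ds$ sitting a distance $s$ below the top weighs $dF = 2\,ds$ and travels a distance $s$, so
$$dW = s\,dF = 2s\,ds, \qquad W = \int_0^{20} 2s\,ds = 400 \text{ J}.$$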

Pros/Cons (depending on how you look at it):

1) It makes calculus extremely algebraic, to the point that you really don't have to understand many of the concepts involved. As long as you know what algebraic fact you need/are using (e.g. solving for $dA/dt$ or setting $dA=0$; see the short sketch below, after item 6), it's just algebra. Of course, a lot of students are weak in their algebra skills...

2) Transition to Calculus II: If they have me again, no big deal. If they have someone else, I would guess they might stutter a bit in the beginning. But I usually use $d/dx$ at least once or twice toward the end of the course, and only once in the past 3 years have I had a student ask me what that meant. When I explained that it meant "take a d, then divide by $dx$," they gave an "oh, duh" response.

3) Students mix up $3x^2$ and $3x^2\,dx$, i.e. they sometimes write a function when they mean a differential, or vice versa. If they write a differential when they mean a function, it usually isn't too big a deal, since when they evaluate the function they simply drop the $dx$. If they write a function when they needed a differential, the error is more of a problem, because they typically need to solve for a ratio of differentials or something similar and don't have the right pieces to do so. All in all, I see this as a small trade-off.

4) Students will get confused when they look at the book. I had a lot of problems when I first started using differentials because students would look to the book to do the homework. Now I have notes that are NOT my examples from lecture, but the examples in the book re-worked using differentials. I've had far fewer problems since. I'm hoping in an upcoming sabbatical to rewrite an open-source Calc book to give it a differentials flavor...

5) Colleagues' opinions. Well, as I mentioned above, maybe I'm lucky that there are two of us among the 6-7 Calc I instructors at our university. The others either think it's fine (as long as they don't have to do it) or are interested in using bits and pieces (but won't commit to going all out). Nobody challenges the rigor of it because, honestly, given our student population, there's little rigor in our Calc I anyway.

6) Tutors. I usually warn the tutors who haven't had me as an instructor about what they might see when students come in. Our tutors are typically junior or senior math majors. When I show them the differential approach they take to it very quickly (after only a couple of examples) and usually wonder why everybody doesn't do it that way.
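
To illustrate what I mean in items 1 and 2 above (again, a toy example of my own with invented numbers): for a related-rates problem about a growing circle, students take the differential of the area formula and then divide by $dt$ and solve,
$$A = \pi r^2 \;\Longrightarrow\; dA = 2\pi r\,dr \;\Longrightarrow\; \frac{dA}{dt} = 2\pi r\,\frac{dr}{dt},$$
so with $r = 5$ cm and $dr/dt = 3$ cm/s the answer is $dA/dt = 30\pi$ cm$^2$/s; an optimization problem is the same manipulation with $dA = 0$ as the equation to solve. In the same spirit, $\frac{d}{dx}(x^3)$ really is "take a d, then divide by $dx$": $d(x^3) = 3x^2\,dx$, and dividing by $dx$ gives $3x^2$.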

So that's my (more than) 2 cents. I could go on and on, but I think that answers most of your questions.

Aeryk
  • Nice answer. I'd love to see your open-source book when it's done. –  May 12 '14 at 15:00
  • Two questions: 1. Have you found the time to write the Calc book you mentioned? I would be curious/interested to see your approach. 2. Quite unrelated to the original question... I don't remember seeing an infinitesimal work defined as $dW = s\cdot dF$. Instead of (force)(distance) I would rather define work as (force)(displacement). Would you have an example (or some theoretical settings) where $dW$ could be identified as $s\cdot dF$? – The Quark Jul 18 '23 at 14:03
  • 1. No. I made a bunch of lecture videos during Covid, but I gave up on writing a text. 2. The classic pulling-a-chain-up-a-building problem is $dW = s\,dF$: the infinitesimal work to raise a slice of chain is the distance that slice travels times its infinitesimal weight. – Aeryk Jul 20 '23 at 02:40
  • Ah, too bad. Thank you for your answers. – The Quark Jul 20 '23 at 10:28