7
$\begingroup$

I'm thinking about teaching calculus by first introducing the asymptotic notations (big-Oh, little-oh, and $\sim$), then explaining their "arithmetic" (things like how to sum little-oh's and the like), and only then doing everything else. For example, the derivative of $f$ at $x_0$ would be defined as the real number $f^\prime(x_0)$, if it exists, such that $f(x) = f(x_0) + f^\prime(x_0) (x - x_0) + o(x - x_0)$ for $x \to x_0$.
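For a concrete illustration of this definition (a worked example I would show in class), take $f(x) = x^2$: since $$x^2 = x_0^2 + 2x_0(x - x_0) + (x - x_0)^2 \quad \text{and} \quad \frac{(x - x_0)^2}{x - x_0} = x - x_0 \to 0 \quad \text{for } x \to x_0,$$ the error term is $o(x - x_0)$, and so $f^\prime(x_0) = 2x_0$.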

Note that, of course, while explaining little-oh's I would implicitly be explaining the definition of a limit equal to $0$.

In my experience, asymptotic notations are usually introduced to students after most of the theory has been presented, and mostly as a tool for computing limits. So I wonder whether this alternative approach has been tried before and could be fruitful. Some pros and cons I see:

Pros:

  • Asymptotic notations give a concise language to express many concepts ("bounded" is $O(1)$, "infinitesimal" is $o(1)$, the expression for the derivative that I already mentioned...) and let one work with them in an almost mechanical way (a worked limit illustrating this follows the list).

  • Students have to learn them anyway sooner or later.
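As a taste of the mechanical computations I have in mind (a small worked example, using the first-order expansions $\sin x = x + o(x)$ and $e^x = 1 + x + o(x)$ for $x \to 0$, which would of course be derived in class first): $$\frac{\sin x\,(e^x - 1)}{x^2} = \frac{(x + o(x))(x + o(x))}{x^2} = \frac{x^2 + o(x^2)}{x^2} = 1 + o(1) \longrightarrow 1 \qquad (x \to 0).$$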

Cons:

  • Asymptotic notations could be a bit difficult to understand at first, because the "=" used with them is not symmetric (things like $o(x^2) = o(x)$ but $o(x) \ne o(x^2)$ for $x \to 0$).
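(To spell out that example, reading "=" as inclusion: every $f = o(x^2)$ is also $o(x)$, because $$\frac{f(x)}{x} = \frac{f(x)}{x^2} \cdot x \longrightarrow 0 \cdot 0 = 0 \quad \text{for } x \to 0,$$ while for instance $x^{3/2} = o(x)$ but $x^{3/2}/x^2 = x^{-1/2} \not\to 0$ for $x \to 0^+$, so $x^{3/2} \ne o(x^2)$.)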
$\endgroup$
15
  • 16
    $\begingroup$ I don't think this is good enough for an answer, but do let me know if you think otherwise. I'm not sure introducing a notation that messes with the usual meaning of "=" is a good idea. A large proportion of people I know learning derivatives still think that "=" means "here is the next thing" and it's hard to teach them it means "is the same as". Introducing a situation where it means something else again is sure to make it even harder! $\endgroup$ Commented Oct 13, 2019 at 13:11
  • 7
    $\begingroup$ To say that $o(x^2) = o(x)$ but not $o(x) = o(x^2)$ is a terrible abuse of notation. Wouldn't it be more correct to write $o(x^2) \subseteq o(x)$ but $o(x) \not\subseteq o(x^2)$? $\endgroup$
    – Xander Henderson
    Commented Oct 13, 2019 at 14:21
  • 5
    $\begingroup$ Also, I contest your assertion that "Students will have to learn [asymptotic notation] anyway soon[er] or later." I have yet to need asymptotic notation for much of anything---I learned a little bit about big-o notation when I took numerical analysis as a master's student, but even that was pretty informal, and not necessary for the work I was doing. $\endgroup$
    – Xander Henderson
    Commented Oct 13, 2019 at 14:56
  • 5
    $\begingroup$ My advice for teaching calculus. Get a good textbook, and follow it very closely. Do not think of using different ways to teach it until you have extensive experience with the level of students you intend it for. $\endgroup$ Commented Oct 13, 2019 at 21:43
  • 7
    $\begingroup$ I appreciate the beauty and simplicity of the big and little oh notation. However, the thought of grading work which makes a technical distinction between o and O, submitted by an audience which writes things like $d/dx = 2x$ when $f(x)=x^2$, gives me pause... $\endgroup$ Commented Oct 13, 2019 at 22:34

5 Answers

10
$\begingroup$

When I was an undergraduate, the big and little oh notations were taught to me in a first-year math course ostensibly aimed at physics students. The class was strong - several of us are now mathematics and physics professors at universities - and the attempt was, to my mind, moderately unsuccessful (although one could counter that I still remember it clearly 27 years later).

Part of the problem is that the big and little oh notation/terminology is a good example of not very good notation/terminology. Notation and terminology work best when they signal their own meanings, and when they easily distinguish similar cases. The big and little oh are too similar to easily distinguish mentally, and when written on the blackboard they often are genuinely indistinguishable. It's better to say simply "bounded" and "infinitesimal" than to write O(1) and o(1)! Particularly in a first-year class, one is unlikely to need to discuss more than these two special cases anyway.
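For the record, everything those two symbols encode in these special cases is (for $x \to a$, say) $$f(x) = O(1) \iff |f(x)| \le C \text{ near } a, \qquad f(x) = o(1) \iff \lim_{x \to a} f(x) = 0,$$ i.e., precisely "bounded near $a$" and "infinitesimal". The words carry the same content and announce it.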

In a classroom setting, with reference to Taylor approximations, these notations are usually used as a way of formalizing physicists' dropping of higher-order terms in a way that makes mathematicians feel good about themselves for not breaking the rules, rather than in a way that makes anything clearer to students. For students early in their careers, I think it's usually better to take the time to write out in more detail what such notation encodes.
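For example (my own illustration), instead of writing $\cos x = 1 - x^2/2 + o(x^2)$ for $x \to 0$, I would rather have students write out the statement this abbreviates, $$\lim_{x \to 0} \frac{\cos x - (1 - x^2/2)}{x^2} = 0,$$ which says directly that the error of the quadratic approximation vanishes faster than $x^2$.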

$\endgroup$
5
  • 2
    $\begingroup$ "Part of the problem is that the big and little oh notation/terminology is a good example of not very good notation/terminology." Except that they are extensively used in Analysis, and an entire field of mathematics and computer science relies on the big Oh notation... "when written on the blackboard they often are genuinely indistinguishable." Mine are clearly distinguishable. $\endgroup$
    – Jorssen
    Commented Oct 13, 2019 at 15:51
  • 13
    $\begingroup$ @Jorssen: That notation/terminology are widely used is by no means argument that they are well chosen. There is plenty of horrible notation in common use. That you write O and o in such a way that they are clearly distinguishable is by no means a guarantee that others do so as well. $\endgroup$
    – Dan Fox
    Commented Oct 13, 2019 at 16:55
  • 2
    $\begingroup$ "That you write O and o in such a way that they are clearly distinguishable is by no means a guarantee that others do so as well." I don't know about other teachers, but any student who writes little-oh or Big-oh in indistiguishable way gets zero score. That's a pretty good guarantee. $\endgroup$
    – Jorssen
    Commented Oct 13, 2019 at 17:33
  • 3
    $\begingroup$ @Jorssen Professors on the blackboard are not the same people in the same setting as students taking tests or writing assignments. I had a professor who couldn't differentiate between $p$ and $\rho$, in a vector analysis class where pressure, density, and the relations between them were the main topic of many lectures. We still had to distinguish them on our test. I can't imagine that $O$ and $o$ are any different. $\endgroup$
    – Arthur
    Commented Oct 14, 2019 at 6:12
  • $\begingroup$ Notation and terminology work best when they signal their own meanings --- For this reason I found myself using $f \ll g$ and $f \approx g$ in 2nd semester calculus when covering series convergence and convergence of improper integrals (e.g. limit comparison tests and the like, although I'd have to indicate how $f \approx g$ was a more general notion than the usual textbook theorem restrictions in which the ratio $f/g$ approaches a positive and finite limit). Note that "big Oh" is not one of these, but I never found I needed "big Oh" for what I wanted to do. $\endgroup$ Commented Oct 14, 2019 at 19:00
8
$\begingroup$

I think that there may be advantages to introducing the idea of asymptotics early, and I think that it might be interesting to experiment a bit with curriculum by bringing in asymptotics in calculus. However, there are caveats:

  1. I think that students need to have a solid understanding of limits and continuity, first. I don't necessarily mean that the students need to be rattling off $\varepsilon$-$\delta$ proofs at the drop of a hat, but they should have a good intuition about what limits are, how they behave (e.g. the limit of a sum is the sum of the limits, &c.), and how limits and continuity are related. Most importantly, the students should demonstrate a clear understanding of the "Squeeze Theorem", since a lot of the discussion of asymptotics reduces to applications of the Squeeze Theorem.

  2. Asymptotics should be introduced gently, without reliance on notation. For example, when teaching precalculus, a fair amount of time is spent discussing how to graph rational functions. When considering the behaviour of a graph at infinity, our current curriculum suggests a notation like $$ \frac{(x+2)^2(3x+4)(7-x)^5}{(x-2)^4(5x-2)^2} = \frac{-3x^8 + \text{junk}}{25x^6 + \text{junk}}, $$ where "$\text{junk}$" would be more rigorously stated as "lower order terms", or could be written in terms of big-Oh notation. In precalculus, we are pretty hand-wavy about what this actually means (though we can make it more rigorous if we really need to). In a calculus class, this could (and should) be made much more rigorous, as the students are expected to know how to take limits. Examples like this could motivate the introduction of special notation for asymptotics. Further motivation comes from L'Hospital's rule. (A sketch of how this bookkeeping can be made precise appears after this list.)

    In any event, the underlying point is that the idea of asymptotics should arise naturally, and come from a desire to simplify the notation for the purposes of computation.

  3. Do not start using ambiguous or abusive notation until your students are very comfortable with the basics. Equality should always be a symmetric relation. If you mean that $f(x) \in O(x)$, then write that. Don't write $f(x) = O(x)$. Calculus students are already trying to assimilate a lot of new information and ideas. Calculus is a hard class for many students, particularly when we consider that it is a terminal class for many of them (at my current institution, maybe half of the students in calculus end up taking further mathematics courses, and less than 10% end up taking upper division mathematics).

    It was argued in comments by the original asker that asymptotic notation comes up a lot in analysis and computer science. Suppose that we accept this claim—so what? The vast majority of students in every elementary calculus class I've ever taught are neither mathematics nor CS majors, and the math and CS majors who are present are likely to learn everything they need to about notation for asymptotics when it actually comes up in future classes. Note that I am not arguing (here) against teaching Landau notation—rather, I am arguing that this is not a good argument in favor of teaching it.

    My recommendation in these kinds of classes is to introduce as little notation and jargon as is humanly possible. Don't ask students to use ambiguous notation. Don't introduce notation which you are not going to use. Don't introduce terms that you aren't going to use later. Make sure that the exposition is clear and unambiguous. Save abuses of notation for later.

  4. Calculus is a prerequisite for other classes at many, many institutions. Future instructors are going to assume certain knowledge, based on an understanding of a curriculum which is pretty well standardized across the United States (and, presumably, elsewhere, as well). If you decide to introduce asymptotics with the associated big-o and little-o notations, make sure that you do so without losing out on any of the other curriculum which your students are going to need to know for future classes. Remember that you, as an instructor, are a member of a community of educators—you are responsible not only to your students, but also to your colleagues, who are going to have to teach your students in future classes.
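As a sketch of how the "junk" bookkeeping in point 2 could be made precise without new notation (the error terms $r$ and $s$ below are introduced just for this illustration): factor out the leading terms, $$\frac{(x+2)^2(3x+4)(7-x)^5}{(x-2)^4(5x-2)^2} = \frac{-3x^8\,(1 + r(x))}{25x^6\,(1 + s(x))} = -\frac{3}{25}\,x^2 \cdot \frac{1 + r(x)}{1 + s(x)},$$ where $r(x), s(x) \to 0$ as $x \to \pm\infty$, so the graph behaves like $-\frac{3}{25}x^2$ at infinity.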

$\endgroup$
7
$\begingroup$

I've never known or needed that notation in my life. And I've used lots of calculus for science and engineering. So one of your assumptions is wrong. I think it would be more efficient to teach a normal calculus class. Leave the esoteric abstractions for later, and for the kids who will really need them (a small subset of the typical calculus population, and not always already sorted in freshman year).

Even for the kids who will eventually need it, I suspect it is more efficient to let them get a working knowledge of the practical calculus first. I don't understand this urge to cram real analysis into calculus class. It ignores practical psychological pedagogy. (It's hard to learn something in its hardest fashion first.)

$\endgroup$
3
  • 1
    $\begingroup$ Likewise, I don't remember using it when I was in college. But this notation is very popular in the computer world, and the relative complexity of an algorithm is often phrased in Big-O terms. $\endgroup$
    – Rusty Core
    Commented Oct 13, 2019 at 18:53
  • $\begingroup$ I honestly can't imagine doing college level calculus or physics without the little o. $\endgroup$
    – IcedLance
    Commented Oct 14, 2019 at 13:41
    $\begingroup$ The OP isn't proposing teaching real analysis or teaching a harder version of calculus. They're simply proposing a different way of teaching calculus. $\endgroup$
    – user507
    Commented Oct 14, 2019 at 21:43
1
$\begingroup$

I agree with most of what's already been said, but I wanted to add one point---the answer is surely going to depend on your audience. Of course, your audience will be people who have never learnt calculus before, so their mathematical maturity and exposure to such things is definitely going to be limited; but depending on where you're teaching, the ability of students to pick new things up will differ widely.

If, in your opinion, you are teaching a bunch of exceptional undergrads/college students who can pick anything up easily and with little risk of permanent confusion, I don't see why you shouldn't teach this notation. Even if, as the other answers said, it might not be of crucial importance, it is certainly good to know, especially for those who will later pursue deeper studies in math, physics, or (especially) computer science.

On the other hand, if you don't think too highly of the ability level of (most of) your students, then I think this is a horrible idea. When there are so many other things new to students which are already going on in a typical calculus class, adding things which risk even more confusion is the worst thing you can do. Even as someone who is reasonably more experienced than your typical calculus freshman, seeing things like $o(x^2)=o(x)$ but $o(x)\neq o(x^2)$ can still make me uncomfortable until I take a bit more time to stare at the equation and figure out what the intended meaning is. Adding this confusing (or perhaps even contentious: see Xander Henderson's answer and comments) notation to a class which already may be struggling with things like taking derivatives and manipulating integrals does not seem like the best idea to me.

In short, always have your audience in mind. If your class is made up of stellar, amazing students, then by all means go for it. Most likely though, most of them are not---in which case big and little oh notation are best left to a course after calculus.

$\endgroup$
0
$\begingroup$

Don't. The language is highly imprecise, confusing, contrary to the precise usage you hopefully will be teaching them, and in my experience not useful for doing calculus.

$\endgroup$
5
  • 2
    $\begingroup$ Big-O notation is not imprecise. It has a rigorous mathematical definition. $\endgroup$
    – user507
    Commented Oct 14, 2019 at 21:44
  • $\begingroup$ @BenCrowell: The common notation is imprecise shorthand for something that can be made precise. $\endgroup$ Commented Oct 14, 2019 at 21:45
  • 1
    $\begingroup$ Big-oh and little-oh notation is so imprecise and not useful for doing calculus that there are thousands of scientific papers (in Analysis) and many symbolic software packages using it. $\endgroup$
    – Jorssen
    Commented Oct 15, 2019 at 13:21
  • $\begingroup$ @Jorssen: IOW it's such that experts in the field are able to understand the shorthand and reconstruct the rigorous argument it's standing in for. That does not make it suitable for the context OP is asking about and it does not make it precise. $\endgroup$ Commented Oct 15, 2019 at 13:24
  • 2
    $\begingroup$ How is $e^x = 1 + x + x^2 / 2 + o(x^2)$ as $x \to 0$ imprecise? There can be disagreement about the pedagogical value of Big-Oh and little-oh notation, but claiming that Big-Oh and little-oh notation is imprecise is simply erroneous. I also point out that you completely ignored the part of my reply that supports the usefulness of such notation. $\endgroup$
    – Jorssen
    Commented Oct 15, 2019 at 14:04
