8
$\begingroup$

I'm writing a minicourse on mathematical analysis together with a colleague. Currently we want to cover the Weierstrass theorem on functions on compact intervals, so the aim is to present only the material needed for that particular result; we plan to enlarge the scope later. I'm a bit stuck on the definition of the limit of a sequence. Does anyone know an intuition or metaphor that naturally motivates the epsilon-style definition? Ideally, after reading it, the student would think "yeah, this is obviously how this should be defined." I have an idea or two, but I don't like them much, so I decided to ask here.

$\endgroup$
  • $\begingroup$ See my answer to the math StackExchange question Why limits work. In that answer I gave the complete text of Anecdotes for limits by James Clyde Bradford [School Science and Mathematics 59 #3 (March 1959), p. 218], which I thought had some merit, but apparently no one else thought so, because it's at the bottom of the answers with 0 upvotes. $\endgroup$ Commented Dec 12, 2014 at 18:30

6 Answers

9
$\begingroup$

My feeling is that the $\epsilon$-$\delta$ formulation is already pretty close to how one should think about limits; that is, the language can be hard to grasp at first, but the idea is expressed very literally in that language and can also be expressed in words. I find it most convincing via a physical-measurement analogy (which is slightly more natural for limits of functions).

So, we think of numbers as measurements made by a physical device, which can have arbitrarily high (but never perfect) precision. We have a sequence of such measurements $(u_n)$; we say that this sequence has limit $\ell$ if, however precise our device is, there is a time (rank) after which all subsequent measurements are indistinguishable from $\ell$: our device cannot tell them apart.

Another approach is game-theoretic [edit: as pointed out in the comments, this had been proposed by Terence Tao on MathOverflow; in any case I do not claim that any of the ideas in this post are original ideas of mine]. Player A wants to prove that $(u_n)$ has limit $\ell$, and player B wants to disprove it. Player B plays by proposing an error margin, as small as she wants but nonzero. Player A then answers with a rank, as large as she wants. If all subsequent terms of the sequence are closer to $\ell$ than the error chosen by B, then A wins the round; otherwise she loses. One says that $(u_n)$ has limit $\ell$ if A can win every round, whatever strategy B uses.
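Spelled out, A's winning condition is exactly the usual formal definition (with $(u_n)$ and $\ell$ as above: B picks the error $\epsilon$, A answers with the rank $N$):

```latex
\lim_{n\to\infty} u_n = \ell
\quad\Longleftrightarrow\quad
\forall \epsilon > 0\ \exists N \in \mathbb{N}\ \forall n \ge N:\ |u_n - \ell| < \epsilon.
```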

$\endgroup$
  • $\begingroup$ I like the game analogy. Nice! $\endgroup$ Commented Dec 12, 2014 at 14:31
  • $\begingroup$ @JosephO'Rourke and Benoît: The game analogy reminds me of paragraph two in T. Tao's answer to MO 38639. $\endgroup$ Commented Dec 12, 2014 at 16:43
  • $\begingroup$ @BenjaminDickman: you are right; I did not have this paragraph in mind when I wrote my answer, but in any case I cannot claim this idea is mine (I do not know whether it is Tao's or whether it was already in use before). Thanks for noticing that. $\endgroup$ Commented Dec 12, 2014 at 18:24
  • $\begingroup$ I think the conception has been around for quite some time; it is at least as old as "The Epsilon Delta Game" (revision date: Jan. 1, 2000) mentioned in the Wolfram Library here, but it would be interesting to trace it back further. (There is a reasonable bound on such a historical search provided by the introduction of delta-epsilon proofs...) $\endgroup$ Commented Dec 13, 2014 at 19:59
3
$\begingroup$

The natural language description of a limit is usually "the number that the values in the sequence get closer and closer to".

The phrase "closer and closer" strongly indicates a time element to sequences. That is, $n$ increasing is the same as time moving forwards. I imagine the values of the sequence as dots on a number line, with a new dot appearing every second.

[Image: dots of a sequence appearing on a number line over time]

The picture in my head of a sequence converging to a limit is the dots appearing one at a time, each literally closer to the limit. It wouldn't do to periodically have a dot appear that is far away from the limit, so we need to be sure that all future dots will be as close as we are now. One or two future dots being slightly farther out would be fine, as long as from some future time onward all of them are closer. And this has to work no matter how close we already are; otherwise it's not "and closer".

And there is the idea of the epsilon. The sequence has limit $L$ if, no matter the $\varepsilon$, there is a time after which all future values are at least that close to $L$. I imagine zooming in very close to $L$ and then waiting to see whether the dots start appearing in the zone I am looking at. If there is a time past which all future dots are in view, then this $\varepsilon$ is good. We need this to work no matter how far we zoom in.

[Image: the sequence over time, with an $\varepsilon$-zone around the limit]

I guess it comes about because of an attempt to pin down what it means to be "closer" and what it would look like if it didn't get closer.
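The "zoom in and wait" picture can be simulated numerically. The helper below is purely illustrative (the name `rank_within` and the sample sequence are my own choices, and it checks only finitely many terms, so it demonstrates rather than proves convergence):

```python
def rank_within(eps, u, candidates=range(1, 10_000)):
    """Return the first rank N such that every later term we sample
    stays within eps of the limit 0 (a finite-horizon illustration)."""
    terms = [(n, u(n)) for n in candidates]
    for i, (n, _) in enumerate(terms):
        if all(abs(x) < eps for _, x in terms[i:]):
            return n
    return None

# The sequence u_n = (-1)^n / n converges to 0, but not monotonically.
u = lambda n: (-1) ** n / n
print(rank_within(0.1, u))   # rank past which every sampled dot stays in the 0.1-zone
print(rank_within(0.01, u))  # zooming in further just pushes the rank out
```

Shrinking `eps` never breaks convergence; it only forces a larger rank, which is exactly the "no matter how far we zoom in" clause.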

$\endgroup$
  • $\begingroup$ One thing I watch out for is that "closer and closer" gives the impression that convergence must be monotone, leading some to suggest that $(\sin n)/n$ doesn't converge as $n \to \infty$, since the next term is often technically slightly farther from zero than the last. "Arbitrarily close" conveys the point while conveniently sidestepping this over-literal reading. $\endgroup$ Commented Nov 17, 2015 at 22:53
2
$\begingroup$

This is very similar to the answer Benoit Kloeckner gave above, but perhaps more informal (and hence more intuitive?). I like to explain the limit of a sequence in the form of a dialogue between two people, A and B:

A: As this sequence of values continues, it gets closer and closer to 12.

B: How close?

A: What do you mean?

B: How close does it get to 12?

A: As close as you want.

B: Does it ever get so close that the values stay always between 11.9 and 12.1?

A: Yes. You just have to go out past the 230th term in the sequence. From that point on, the values are always between 11.9 and 12.1.

B: Does it ever get so close that the values stay always between 11.99 and 12.01?

A: Sure. Just wait until you get past the 12,042nd term in the sequence. From that point on, the values are always between 11.99 and 12.01.

B: Well, does it ever get so close that the values always stay between --

A: Look, let me make it simple for you. You pick any tiny little window around 12 that you want, no matter how small, and I guarantee you that I can find some term in the sequence beyond which the values always stay within that window. Okay?

B: Got it.

The $\epsilon$-$N$ definition of convergence is pretty much a straightforward transcription of A's last sentence in this dialogue into formal language.
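Indeed, writing the sequence as $(a_n)$ (my notation) and B's "tiny little window" as $(12 - \epsilon,\ 12 + \epsilon)$, A's guarantee becomes:

```latex
\forall \epsilon > 0\ \exists N\ \forall n \ge N:\ |a_n - 12| < \epsilon.
```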

$\endgroup$
1
$\begingroup$

Maybe a topological approach would be better: the sequence has limit $L$ if and only if every open interval containing $L$ eventually contains all remaining terms of the sequence. Of course, making this rigorous in a metric space immediately gives the $N$-$\epsilon$ definition. But the intuition is that eventually the sequence never strays far from $L$.
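In symbols (for a real sequence $(u_n)$, my notation), the topological phrasing reads:

```latex
u_n \to L
\quad\Longleftrightarrow\quad
\text{for every open interval } I \ni L,\ \exists N\ \forall n \ge N:\ u_n \in I.
```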

$\endgroup$
1
$\begingroup$

Here is my "unfinished" attempt, which I used in one of my classes last year. Generally speaking, it is based on the idea of "proof-generated definitions" introduced by Lakatos. As such, the question is which "theorem" could justify and motivate the definition of a convergent sequence. In fact, the first theorem that comes to mind works: if a sequence converges, then its limit is unique. Obviously, you might say, we need a definition of the limit of a sequence to prove this "obvious" statement; that is the whole point of a proof-generated definition. My unfinished attempt can be found here.
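To make the connection concrete, here is a sketch of the standard uniqueness argument; it is exactly the place where the $\epsilon$-style definition earns its keep:

```latex
% Suppose u_n \to \ell and u_n \to \ell' with \ell \neq \ell'.
% Apply the definition with \epsilon = |\ell - \ell'|/2 and take
% n beyond both ranks; the triangle inequality then gives
|\ell - \ell'| \le |\ell - u_n| + |u_n - \ell'|
             < \tfrac{|\ell - \ell'|}{2} + \tfrac{|\ell - \ell'|}{2}
             = |\ell - \ell'|,
% a contradiction, so \ell = \ell'.
```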

$\endgroup$
-1
$\begingroup$

Maybe you could contrast $\sin(x)/x$ and $\sin(1/x)$ as $x \to 0$, arguing that the former stays within an $\epsilon$-ball (vertical $y$-interval) while it approaches $1$ (somewhat non-obviously) from the right/$+x$, whereas the latter oscillates and does not stay within any small $\epsilon$-ball, and in fact covers the whole $y$-interval $[-1,1]$ as $x \to 0$...?


[Image: plots of $\sin(x)/x$ and $\sin(1/x)$ near $x = 0$]
$\endgroup$
  • $\begingroup$ This seems to be an image for "limit as $x \to 0$", whereas the OP asked for "limit of a sequence". $\endgroup$ – mweiss, Commented Dec 12, 2014 at 2:50
  • $\begingroup$ @mweiss: Point taken, but of course the values of the functions as $x \to 0$ form a sequence. $\endgroup$ Commented Dec 12, 2014 at 10:48
  • $\begingroup$ Which values, though? One can certainly choose a sequence of values $x_i$ approaching $0$, then look at the sequence of values $f(x_i)$, and talk about the limit of that sequence... But that limit may exist even if the limit as $x \to 0$ does not exist at all. To bridge the gap between "limit as $x \to 0$" and "limit of a sequence" you have to consider every possible sequence $x_i$ that goes to $0$. This takes a fairly simple concept (the limit of a single sequence) and embeds it in an explanation of a much more complicated one (the limit of a function at a point). $\endgroup$ – mweiss, Commented Dec 12, 2014 at 19:32
  • $\begingroup$ @mweiss: Good point; you are right. I stand corrected. $\endgroup$ Commented Dec 13, 2014 at 0:07
