One can use the Baire category theorem to show that if $f:\mathbb{R} \to \mathbb{R}$ is differentiable, then $f'$ is continuous at some $c \in \mathbb{R}$. Is there an elementary proof of this fact? By "elementary" I mean at the level of intro real analysis.

Edit: In spite of the decent response this question has gotten, after more than two and a half months there are still no answers. It is possible that there is some "deep" reason we should not expect an elementary proof of this. I will therefore also accept a well-reasoned discussion of why such a proof is unlikely.

  • I guess that if there were such a (known) proof, it would be very well known. +1 to this interesting question, though. – ajotatxe, Sep 22, 2016 at 19:06
  • This is much weaker than the known result that $f'$ is in fact continuous on a dense $G_\delta$ set, so I figured there's some hope. – Sep 23, 2016 at 2:43
  • @chx "Elementary" and "mathoverflow" don't really go together. – Sep 26, 2016 at 15:10
  • Sure it does. As @ajotatxe says, if there were such a proof it would be well known, so research is needed to find it, and mathoverflow is research. It's not as if elementary questions never lead into the darkest corners; see Mochizuki and the abc conjecture. – chx, Sep 26, 2016 at 17:14
  • That's not true, Arthur: take a look at $x^2\sin(1/x)$ around $0$, or, for something a bit more puzzling, at Volterra's function. (A quick computation for the first example is given after these comments.) – Renart, Sep 27, 2016 at 13:19
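
For readers who haven't met the first example in Renart's comment, here is the standard computation showing that a derivative can exist everywhere yet fail to be continuous (background only, not part of the answer below). Define

$$f(x) = \begin{cases} x^2\sin(1/x) & x \neq 0,\\ 0 & x = 0. \end{cases}$$

For $x\neq 0$ the product and chain rules give $f'(x) = 2x\sin(1/x) - \cos(1/x)$, while at the origin $f'(0)=\lim_{h\to 0}\frac{h^2\sin(1/h)}{h}=\lim_{h\to 0}h\sin(1/h)=0$ by squeezing. So $f$ is differentiable everywhere, but $f'(x)$ oscillates like $-\cos(1/x)$ as $x\to 0$ and has no limit there: $f'$ is discontinuous at $0$.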

1 Answer


One can actually prove with elementary tools something stronger, namely the following:

Theorem: If $f:\mathbb{R}\rightarrow\mathbb{R}$ is differentiable on a non-degenerate interval $[a,b]$, then $f'$ is continuous at some point $c\in(a,b)$. (A corollary is that if $f$ is differentiable throughout $\mathbb{R}$, then $f'$ is continuous on a dense subset of $\mathbb{R}$.)

Proof: For any non-degenerate $[u,t]\subseteq[a,b]$, define $\operatorname{osc}(u,t)=\sup_{[w,z]\subseteq[u,t]}\left|\frac{f(t)-f(u)}{t-u} - \frac{f(z)-f(w)}{z-w}\right|$, where the supremum ranges over non-degenerate subintervals $[w,z]$ (with $\operatorname{osc}(u,t)=+\infty$ if the right-hand side is unbounded; this can easily be made formal with a little more verbiage). Informally, $\operatorname{osc}(u,t)$ measures how much the slope of $f$ can "oscillate" within the interval $[u,t]$; the essence of the proof is showing that $\operatorname{osc}(u,t)$ must converge to $0$ as $u$ and $t$ converge to some point $c$, and that $c$ is then a point at which $f'$ is continuous.
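
As a quick sanity check of the definition, take $f(x)=x^2$ (a smooth case, where $\operatorname{osc}$ ought to be small on small intervals). The slope over any $[w,z]$ is $\frac{z^2-w^2}{z-w}=w+z$, and $w+z$ ranges over $(2u,2t)$ as $[w,z]$ ranges over the non-degenerate subintervals of $[u,t]$, so

$$\operatorname{osc}(u,t)=\sup_{[w,z]\subseteq[u,t]}\left|(u+t)-(w+z)\right|=t-u,$$

which indeed tends to $0$ as the interval shrinks, consistent with $f'(x)=2x$ being continuous everywhere.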

Consider a generic sequence of "concentric and convergent" intervals $[u_i,t_i]$, i.e. one satisfying $u_i\leq u_{i+1}<t_{i+1}\leq t_i$ and $|t_i-u_i|\rightarrow 0$. Let $u_i,t_i\rightarrow v$ (both $u_i$ and $t_i$ are monotone and bounded, so the limit exists and is common to the two sequences). Then $\lim \frac{f(t_i)-f(u_i)}{t_i-u_i}=f'(v)$: since $f$ is differentiable at $v$, both $\frac{f(t_i)-f(v)}{t_i-v}$ and $\frac{f(v)-f(u_i)}{v-u_i}$ converge to $f'(v)$, and $\frac{f(t_i)-f(u_i)}{t_i-u_i}$, being a convex combination of these two quotients, also converges to $f'(v)$.
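
For completeness, the convex-combination step written out (assuming $u_i<v<t_i$; if $v$ coincides with an endpoint, the difference quotient is simply the single remaining term):

$$\frac{f(t_i)-f(u_i)}{t_i-u_i}=\frac{t_i-v}{t_i-u_i}\cdot\frac{f(t_i)-f(v)}{t_i-v}+\frac{v-u_i}{t_i-u_i}\cdot\frac{f(v)-f(u_i)}{v-u_i},$$

where the two weights are nonnegative and sum to $1$, so the left-hand side is trapped between two quantities that both converge to $f'(v)$.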

Let us now show that there is a sequence of concentric and convergent intervals $[u_i,t_i]\subseteq (a,b)$ for which $\operatorname{osc}(u_i,t_i)\rightarrow 0$. Suppose this were not the case. Then there would be a non-degenerate $[w_0,z_0]\subseteq(a,b)$ and an $\epsilon>0$ such that $\operatorname{osc}(u,t)>\epsilon$ for every non-degenerate $[u,t]\subseteq[w_0,z_0]$ (otherwise, within any interval we could greedily select nested subintervals of ever smaller oscillation and length, producing exactly the sequence we are looking for). In particular, given $[w_i,z_i]$ we could always find a non-degenerate $[w_{i+1},z_{i+1}]\subseteq [w_i,z_i]$ such that $\left|\frac{f(z_i)-f(w_i)}{z_i-w_i} - \frac{f(z_{i+1})-f(w_{i+1})}{z_{i+1}-w_{i+1}}\right|>\epsilon/2$; note that $\epsilon$ is independent of $i$. Furthermore, such $[w_{i+1},z_{i+1}]$ could always be chosen arbitrarily small, since if $g:\mathbb{R}\rightarrow\mathbb{R}$ is continuous on a non-degenerate interval $[\alpha,\beta]$, then for any $\delta>0$ there exists a non-degenerate interval $[\alpha',\beta']\subseteq(\alpha,\beta)$ with $|\beta'-\alpha'|<\delta$ and $\frac{g(\beta)-g(\alpha)}{\beta-\alpha}=\frac{g(\beta')-g(\alpha')}{\beta'-\alpha'}$ (the proof is essentially identical to that of Rolle's theorem, stopping just before the differentiation limit is taken; a sketch follows this paragraph). Thus $\frac{f(z_i)-f(w_i)}{z_i-w_i}$ would not converge to a (finite) limit, contradicting the differentiability of $f$ at $\lim w_i = \lim z_i$.
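
Here is one way to make the Rolle-style shrinking lemma explicit (a sketch; the bookkeeping needed to land strictly inside $(\alpha,\beta)$ is routine but omitted). Let $m=\frac{g(\beta)-g(\alpha)}{\beta-\alpha}$, pick $n$ large enough that $\ell=(\beta-\alpha)/n<\delta$, and consider the continuous slope function

$$s(x)=\frac{g(x+\ell)-g(x)}{\ell},\qquad x\in[\alpha,\beta-\ell].$$

Telescoping gives $\frac{1}{n}\sum_{k=0}^{n-1}s(\alpha+k\ell)=m$, so some $s(\alpha+k\ell)\geq m$ and some $s(\alpha+k\ell)\leq m$; by the intermediate value theorem there is an $x^*$ with $s(x^*)=m$, and $[\alpha',\beta']=[x^*,x^*+\ell]$ has length $\ell<\delta$ and the same slope $m$ as $[\alpha,\beta]$.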

Now consider a sequence of concentric and convergent intervals $[u_i,t_i]\subseteq(a,b)$, with $u_i,t_i\rightarrow c$, for which $\operatorname{osc}(u_i,t_i)\rightarrow 0$; note that the intervals can be taken strictly nested ($u_i<u_{i+1}<t_{i+1}<t_i$, e.g. by applying the selection step above to a slightly smaller closed subinterval each time), so that $u_i<c<t_i$ for every $i$. It is straightforward to check that

$$\operatorname{osc}(u_i,t_i)=\max\left(\sup_{[w,z]\subseteq[u_i,t_i]} \frac{f(z)-f(w)}{z-w} - \frac{f(t_i)-f(u_i)}{t_i-u_i},\ \frac{f(t_i)-f(u_i)}{t_i-u_i} - \inf_{[w,z]\subseteq[u_i,t_i]} \frac{f(z)-f(w)}{z-w}\right),$$

so if $\operatorname{osc}(u_i,t_i)\rightarrow 0$ then both $\sup_{[w,z]\subseteq[u_i,t_i]}\frac{f(z)-f(w)}{z-w}$ and $\inf_{[w,z]\subseteq[u_i,t_i]}\frac{f(z)-f(w)}{z-w}$ converge to $\lim \frac{f(t_i)-f(u_i)}{t_i-u_i} = f'(c)$. And since every $x\in(u_i,t_i)$ satisfies $\inf_{[w,z]\subseteq[u_i,t_i]}\frac{f(z)-f(w)}{z-w}\leq f'(x)\leq\sup_{[w,z]\subseteq[u_i,t_i]}\frac{f(z)-f(w)}{z-w}$ (the derivative at $x$ is a limit of difference quotients over subintervals of $[u_i,t_i]$; see below), and since for each fixed $i$ every $x$ sufficiently close to $c$ lies in $(u_i,t_i)$, it follows that $f'(x)\rightarrow f'(c)$ as $x\rightarrow c$, i.e. $f'$ is continuous at $c$.
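
In case the last inequality is not obvious, here it is spelled out: for $x\in(u_i,t_i)$ and $h\neq 0$ small enough that the closed interval with endpoints $x$ and $x+h$ lies inside $[u_i,t_i]$,

$$\inf_{[w,z]\subseteq[u_i,t_i]}\frac{f(z)-f(w)}{z-w}\;\leq\;\frac{f(x+h)-f(x)}{h}\;\leq\;\sup_{[w,z]\subseteq[u_i,t_i]}\frac{f(z)-f(w)}{z-w},$$

since $\frac{f(x+h)-f(x)}{h}$ is itself the slope over such a subinterval; letting $h\to 0$ shows that $f'(x)$ obeys the same (non-strict) bounds.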

  • Makes sense to me, so I've accepted it. Well done, but note that I'll keep an eye out in case someone finds a problem with this. – Jan 2, 2017 at 2:26
