
I have the following question. Let $f,g:\mathbb{R}\to\mathbb{R}$ be two continuous functions (vanishing at infinity) and assume that $$ \frac{f(x+t)+f(x-t)-2f(x)}{t^2}\to g(x) $$ for all $x\in\mathbb{R}$ as $t\to 0$. Apparently, this implies that $f$ is twice continuously differentiable. The argument I read was something along the following lines: the Schwarz second-order derivative (the limit above) is equal to $g$, and by the continuity of $g$ and the de la Vallée Poussin theorem, $f$ has an ordinary second-order derivative equal to $g$, which apparently proves the claim.
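To make the hypothesis concrete, here is a quick numerical sketch (the helper name `symmetric_second_quotient` is mine, and $f=\sin$ is just a sample smooth function): for a $C^2$ function the Schwarz quotient converges to $f''(x)$, which is the easy direction noted below.

```python
# Numerical illustration (not part of the question): for a smooth f,
# the symmetric second difference quotient tends to f''(x) as t -> 0.
# Here f = sin, so f''(x) = -sin(x); x = 1.0 is an arbitrary sample point.
import math

def symmetric_second_quotient(f, x, t):
    """(f(x+t) + f(x-t) - 2 f(x)) / t^2 -- the Schwarz quotient."""
    return (f(x + t) + f(x - t) - 2 * f(x)) / t**2

x = 1.0
for t in (0.1, 0.01, 0.001):
    print(t, symmetric_second_quotient(math.sin, x, t), -math.sin(x))
```

The question is about the converse: mere pointwise existence of this limit (with continuous limit function $g$) forcing $f\in C^2$.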

Is somebody able to explain this to me? I cannot see how the theorem applies here, or why this should be true. Of course, if $f$ is already twice continuously differentiable, then the limit exists, but that is not part of the assumption.

Thanks a lot in advance for your help.

  • I don't understand. Maybe you mean $+$ between $f(x+t)$ and $f(x-t)$?
    – G. Melfi
    Commented Oct 25, 2023 at 7:29
  • @G.Melfi Yes, indeed, it has to be a plus, not a minus. I have amended the question. Thanks, and I apologize for the inconvenience caused. Commented Oct 25, 2023 at 7:40
  • @SonamIdowu Can you provide some sources for the claim that if the limit condition is satisfied at every point and $g$ is continuous, then $f''=g$? Commented Oct 25, 2023 at 14:00
  • Though they do not answer the question (AFAICT), the answers to this other question (as well as this almost identical one on MSE) about the "exact Peano derivative" seem at least somewhat relevant here.
    – Gro-Tsen
    Commented Oct 25, 2023 at 16:30

2 Answers


This is more of a long comment than an answer. First, the analogous statement for the first derivative is already non-trivial, although not very difficult; see Aull, Charles E., "The first symmetric derivative", The American Mathematical Monthly 74.6 (1967): 708–711.
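To see why the first-order statement has content, here is a small illustration (my own example, not taken from Aull's paper): the first symmetric quotient can exist where the ordinary derivative does not, so its pointwise existence genuinely says less.

```python
# The first symmetric quotient (f(x+t) - f(x-t)) / (2t) can exist at a point
# where f' does not: f(x) = |x| is not differentiable at 0, yet at x = 0 the
# quotient is identically 0 for every t != 0, since |t| = |-t|.

def symmetric_first_quotient(f, x, t):
    return (f(x + t) - f(x - t)) / (2 * t)

for t in (0.5, 0.01, 1e-6):
    print(symmetric_first_quotient(abs, 0.0, t))   # 0.0 every time
```

So the symmetric (first or second) derivative existing pointwise is a strictly weaker hypothesis than ordinary differentiability, and some extra input (here, continuity of the limit everywhere) is needed.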

Second, denote $$ (L_th)(x)=\frac{h(x+t)+h(x-t)-2h(x)}{t^2}; $$ then if $\phi$ is a test function, we have $$ \int_\mathbb{R}f\phi''=\int_\mathbb{R}f\lim_{t\to 0}L_t\phi\stackrel{(1)}{=}\lim_{t\to 0}\int_\mathbb{R}fL_t\phi=\lim_{t\to 0}\int_\mathbb{R}(L_tf)\phi\stackrel{(2)}{=}\int_\mathbb{R}\lim_{t\to 0}(L_tf)\phi=\int_\mathbb{R} g\phi $$ (the middle equality is just the change of variables $x\mapsto x\pm t$). This would imply that $f''=g$ in the distributional sense; that is, $f$ coincides, in the distributional sense, with a suitable second anti-derivative of $g$. By applying du Bois-Reymond's lemma twice, we see that they actually coincide as functions. But since $g$ is continuous, its second anti-derivative is twice continuously differentiable.

Of the two exchanges of the limit and the integral, (1) is clear, since for smooth functions $L_t\phi\to \phi''$ uniformly, while (2) is the troublesome one - I don't see how to deduce it from the conditions at hand. But the argument shows that if we are willing to assume a bit more, e.g., that the convergence $L_t f\to g$ is uniform, then the result follows easily.
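The self-adjointness step $\int_\mathbb{R} f\,L_t\phi=\int_\mathbb{R}(L_tf)\,\phi$ used between (1) and (2) can be checked discretely: on a uniform grid with $t$ a multiple of the spacing it is literally a reindexing of a finite sum. The grid, the step, and the bump functions below are my own choices for illustration.

```python
# Discrete check of int f (L_t phi) = int (L_t f) phi: with zero padding and
# compactly supported data, the two sums are rearrangements of each other.
import math

h = 0.01                      # grid spacing; grid covers [-5, 5]
N = 1000
xs = [h * (n - N // 2) for n in range(N + 1)]

def bump(x):                  # smooth, compactly supported in (-1, 1)
    return math.exp(-1.0 / (1.0 - x * x)) if abs(x) < 1 else 0.0

f   = [bump(x / 2.0) for x in xs]    # supported in (-2, 2)
phi = [bump(x) for x in xs]          # test function, supported in (-1, 1)

def L(arr, k):                # symmetric second difference with t = k*h
    t2 = (k * h) ** 2
    def get(n):               # zero outside the grid (both arrays vanish there)
        return arr[n] if 0 <= n < len(arr) else 0.0
    return [(get(n + k) + get(n - k) - 2 * arr[n]) / t2
            for n in range(len(arr))]

k = 7                         # t = 0.07
lhs = sum(a * b for a, b in zip(f, L(phi, k))) * h
rhs = sum(a * b for a, b in zip(L(f, k), phi)) * h
print(lhs, rhs)               # agree up to rounding
```

This confirms that the only delicate point in the distributional argument is (2), the passage to the limit in $t$ on the $(L_tf)\phi$ side, not the integration by parts itself.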

UPD: the actual argument is much simpler. Let $a<b$, and let $G$ be the second anti-derivative of $g$ such that $(f-G)(a)=(f-G)(b)=0.$ We need to show that $f\equiv G$ on $[a,b]$; replacing $f-G$ with $G-f$ handles the minimum, so it suffices to rule out $\max_{[a,b]}(f-G)>0$. Assume that $\max_{[a,b]}(f-G)>0$. Then, $$\max_{[a,b]}H_\epsilon>0$$ for $\epsilon>0$ small enough, where $H_\epsilon(x)=f(x)-G(x)-\epsilon (b-x)(x-a)$. But $L_t[\epsilon(b-x)(x-a)]\equiv -2\epsilon$ for every $t$, so $\lim_{t\to 0}(L_tH_\epsilon)(x)= 2\epsilon$ for every $x$ and every $\epsilon.$ So, for any $x,$ we have $H_\epsilon(x+t)+H_\epsilon(x-t)-2H_\epsilon(x)>0$ for $t$ small enough. Since $H_\epsilon(a)=H_\epsilon(b)=0$ and the maximum is positive, $H_\epsilon$ attains its maximum at some point of $(a,b)$; applying the inequality at that point gives a contradiction, as at an interior maximum $H_\epsilon(x+t)+H_\epsilon(x-t)-2H_\epsilon(x)\le 0$.
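The one computation the barrier argument rests on is that $L_t$ applied to the quadratic $\epsilon(b-x)(x-a)$ equals $-2\epsilon$ exactly, for every $t$, not just in the limit (the symmetric second difference of a quadratic is exactly $t^2$ times its second derivative). A quick check, with sample values of $a,b,\epsilon,x,t$ of my own choosing:

```python
# For q(x) = eps*(b-x)*(x-a) we have q'' = -2*eps, and for any quadratic the
# symmetric second difference equals t^2 * q'' exactly, so (L_t q)(x) = -2*eps
# for EVERY t. Hence lim_{t->0} L_t H_eps = g - g + 2*eps = 2*eps.

def q(x, a=0.0, b=1.0, eps=0.1):
    return eps * (b - x) * (x - a)

def Lt(f, x, t):
    return (f(x + t) + f(x - t) - 2 * f(x)) / t**2

for t in (0.5, 0.05, 0.005):
    print(Lt(q, 0.3, t))      # -0.2 up to rounding, independent of t
```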

  • Are you now claiming an affirmative answer without assuming uniform convergence to $g$? Commented Oct 25, 2023 at 16:18
  • Thank you very much, Kostya_I. Do you think uniform convergence on compact sets is sufficient? Commented Oct 25, 2023 at 18:09
  • @MikhailKatz The argument under UPD works just fine with merely pointwise convergence, as stated in the OP. After some research, the result apparently goes back to Schwarz himself...
    – Kostya_I
    Commented Oct 25, 2023 at 18:28
  • Nice proof... does a similar argument exist for the first derivative? Commented Oct 25, 2023 at 20:24
  • @Kostya, can you elaborate on your research? This could be included in the body of your answer. Commented Oct 26, 2023 at 12:28

One approach would be to use geodesic curvature. For a suitably smooth function $h(t)$, the curvature of its graph in the plane is $k(t)=\dfrac{h''(t)}{(1+h'(t)^2)^{3/2}}$. Since the expected second derivative $g$ is continuous, we can integrate it to get the expected first derivative, and plugging both into the formula above gives the expected value of the curvature at each point. But by the fundamental theorem of the theory of curves (here in the planar case, which is easier), a prescribed curvature function determines a curve uniquely up to rigid motion, hence uniquely once the initial point and tangent are fixed. By uniqueness it would then have to be the graph of our function, and by construction the first and second derivatives exist everywhere.
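The reconstruction step can be sketched numerically (a toy example of my own, not part of the answer): given the prescribed curvature $k(t)$ of a graph, recover $h$ by solving $h''(t)=k(t)\,(1+h'(t)^2)^{3/2}$ with matching initial data. Here $k$ is computed from $h=\sin$, so the solver should reproduce $\sin$.

```python
# Recover h from its graph curvature by integrating
#   h'' = k(t) * (1 + h'^2)^(3/2)
# with classical RK4. For h = sin: h'' = -sin t, h' = cos t.
import math

def k(t):                                   # curvature of the graph of sin
    return -math.sin(t) / (1 + math.cos(t) ** 2) ** 1.5

def rk4(t0, y0, v0, t1, n=1000):
    """Integrate (y, v)' = (v, k(t)*(1+v^2)^1.5) from t0 to t1 in n steps."""
    dt = (t1 - t0) / n
    t, y, v = t0, y0, v0
    def acc(t, v):
        return k(t) * (1 + v * v) ** 1.5
    for _ in range(n):
        k1y, k1v = v, acc(t, v)
        k2y, k2v = v + 0.5 * dt * k1v, acc(t + 0.5 * dt, v + 0.5 * dt * k1v)
        k3y, k3v = v + 0.5 * dt * k2v, acc(t + 0.5 * dt, v + 0.5 * dt * k2v)
        k4y, k4v = v + dt * k3v, acc(t + dt, v + dt * k3v)
        y += dt * (k1y + 2 * k2y + 2 * k3y + k4y) / 6
        v += dt * (k1v + 2 * k2v + 2 * k3v + k4v) / 6
        t += dt
    return y, v

y1, v1 = rk4(0.0, 0.0, 1.0, 1.0)            # h(0)=0, h'(0)=1, as for sin
print(y1, math.sin(1.0))                    # close agreement
```

Of course, this only illustrates the uniqueness heuristic; the mathematical gap in the approach is that one must first know $f$ is regular enough to *have* a curvature before the fundamental theorem of curves applies.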
