25
$\begingroup$

For instance, the absolute value function is defined and continuous on the whole real line, but its derivative is a step function with a jump discontinuity at the origin.

For some nice functions, though, such as $e^x$ or $\sin(x)$, the derivative is of course no "worse" than the original function.

Can I say something about the typical behavior of the derivative? Is it typically not as nice as the original function?

$\endgroup$
7
  • 8
    $\begingroup$ Worse in what sense? The derivative of an analytic function is at least no worse, and I'd say it's sometimes even better than the original one! $\endgroup$
    – Red shoes
    Commented Oct 8, 2017 at 0:47
  • 1
    $\begingroup$ @Redshoes in what sense is it better? I think they have the exact same radius of convergence. $\endgroup$
    – Vim
    Commented Oct 8, 2017 at 2:17
  • $\begingroup$ @Vim: It's more well-behaved, right? A line is better-behaved than a parabola, etc. $\endgroup$
    – user541686
    Commented Oct 8, 2017 at 6:09
    $\begingroup$ @Mehrdad most analytic functions aren't finite polynomials, so no, I don't think this is what Red Shoes meant. $\endgroup$
    – Vim
    Commented Oct 8, 2017 at 6:10
  • 3
    $\begingroup$ In systems and control theory, there is a general recommendation to avoid derivatives and differentiators as much as possible, because they insert noise and high-frequency signals into the system, which are hard to control, detect, predict, or stabilize. $\endgroup$
    – polfosol
    Commented Oct 8, 2017 at 7:28

4 Answers

41
$\begingroup$

Yes, that is completely right. Conversely, integration makes functions nicer.

One way of measuring how "nice" a function is, is by how many derivatives it has. We say a function $f\in C^k$ if it is $k$ times continuously differentiable. The more times differentiable it is, the nicer, or "smoother", the function is. If a function is $k$ times differentiable, then its derivative is $k-1$ times differentiable. A function is "as nice" as its derivative if and only if it's smooth (infinitely differentiable). These are functions like $\sin(x)$, $e^x$, polynomials, etc.

Conversely, integration makes things nicer. For example, integrating even a discontinuous function results in a continuous function: Is an integral always continuous?
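Here's a quick numeric sanity check of that last claim (the grid, step function, and tolerances are my own choices, not from the linked question): the running integral of a function that jumps at $0$ has no jump there.

```python
import numpy as np

# Sketch: the running integral of a discontinuous step function is continuous.
x = np.linspace(-1.0, 1.0, 2001)
f = (x >= 0).astype(float)          # jumps from 0 to 1 at x = 0

# Running integral via the trapezoidal rule; the exact answer is max(x, 0).
dx = np.diff(x)
F = np.concatenate(([0.0], np.cumsum((f[1:] + f[:-1]) / 2 * dx)))

i = np.argmin(np.abs(x))            # index of the grid point x = 0
print(abs(f[i + 1] - f[i - 1]))     # 1.0: f has a jump at 0
print(abs(F[i + 1] - F[i - 1]))     # ~0: F is continuous there
print(np.max(np.abs(F - np.maximum(x, 0.0))))  # tiny discretization error
```

The integral tracks $\max(x,0)$ to within the discretization error, and the unit jump in $f$ leaves no jump in $F$.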

$\endgroup$
3
  • $\begingroup$ except in instances where the function doesn't have a nice integral and you're left with some nasty sum or product or something. $\endgroup$
    – tox123
    Commented Oct 8, 2017 at 15:56
  • 7
    $\begingroup$ @tox123 See also this MathOverflow answer. As noted in a comment there, if you're thinking about it in terms of formulas then integration can certainly lead to nastier results. But the function itself will still be nicer, or at least as nice, as the function you started with, in the way described in this answer. $\endgroup$
    – Carmeister
    Commented Oct 8, 2017 at 17:18
  • 4
    $\begingroup$ It's worth pointing out, for the unwary reader, that the MathOverflow post @Carmeister linked to is an answer to a question asking for harmful heuristics, and as such the bolded sentence there is not bolded for emphasis but to highlight it as a dangerous belief. $\endgroup$
    – user856
    Commented Oct 8, 2017 at 21:04
14
$\begingroup$

The concept you're talking about is called smoothness (Wikipedia, MathWorld).

Functions like $e^x$, $\sin(x)$, and polynomials are called "smooth" because their derivatives of every order exist and are continuous. Smooth functions have derivatives all the way down, so they're as nice as their derivatives.

But functions like $\operatorname{abs}(x)$ and $\operatorname{sgn}(x)$ aren't smooth, since there are discontinuities either in them or in their derivatives. They're nicer than their derivatives.

A function is in class $C^k$ if its derivatives up to and including order $k$ are continuous. So the number of levels of niceness depends on $k$. Think about how integrating $\operatorname{sgn}(x)$ gives you $\operatorname{abs}(x)$, which in turn gives $\frac{1}{2}\operatorname{sgn}(x)x^2$, and so on. As Zachary Selk points out, you can make functions nicer by integrating them.
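That ladder can be seen numerically (a sketch with a grid, helper, and tolerances of my own choosing): each antiderivative gains one degree of smoothness, which we can detect by measuring jumps across $x = 0$.

```python
import numpy as np

# sgn(x) jumps at 0; its antiderivative |x| is C^0 (continuous, kink at 0);
# the next antiderivative sgn(x)*x^2/2 is C^1 (continuous derivative).
x = np.linspace(-1.0, 1.0, 2001)
h = x[1] - x[0]
i = np.argmin(np.abs(x))                 # index of the grid point x = 0

def jump_at_zero(y):
    """Approximate jump of y across x = 0; ~0 for a continuous function."""
    return abs(y[i + 1] - y[i - 1])

f0 = np.sign(x)                          # not continuous
f1 = np.abs(x)                           # C^0
f2 = np.sign(x) * x**2 / 2               # C^1

print(jump_at_zero(f0))                  # 2.0: the jump in sgn
print(jump_at_zero(f1))                  # ~0:  |x| is continuous
print(jump_at_zero(np.gradient(f1, h)))  # ~2:  but its derivative jumps
print(jump_at_zero(np.gradient(f2, h)))  # ~0:  f2' = |x| is continuous
```

Each step up the ladder pushes the jump one derivative further away.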

In fact, for most functions, it's more likely that they can be integrated than differentiated: every continuous function is integrable, but a "typical" continuous function is nowhere differentiable. Not only is being "nice" a rare trait; being differentiable at all is too.

$\endgroup$
12
  • 2
    $\begingroup$ In numerical methods, integration is more accurate than differentiation, for given effort. $\endgroup$
    – Philip Roe
    Commented Oct 8, 2017 at 1:53
  • 1
    $\begingroup$ It's an aspect of niceness, and in its own way quite important. $\endgroup$
    – Philip Roe
    Commented Oct 8, 2017 at 2:02
  • 1
    $\begingroup$ Actually, that is your interpretation of nice. But mine is related. It's all about the sources of error. $\endgroup$
    – Philip Roe
    Commented Oct 8, 2017 at 2:06
  • 1
    $\begingroup$ I'm ending this discussion since I don't think you're really listening to me. The question, as framed, is about which functions are continuous and differentiable. Your point seems to be about the error terms of algorithms, which has no relevance whatsoever. OP wants to know when a function will have a jump in it, like $y=\mathrm{sgn}(x)$. You're making the case that numerical integration has more rapid convergence than numerical differentiation. $\endgroup$
    – Jam
    Commented Oct 8, 2017 at 2:13
  • 2
    $\begingroup$ Actually, the derivatives of the functions $|\cdot|$ and $\operatorname{sgn}(\cdot)$ are continuous on their domains! $\endgroup$
    – Red shoes
    Commented Oct 8, 2017 at 2:44
2
$\begingroup$

You can view it as a consequence of a convolution being at least as nice as the nicer of the two functions involved. And an integral of a function $f$ is exactly that, a convolution:

$$\int_{-\infty}^t f(\tau)d\tau = \int_{-\infty}^\infty f(\tau) H(t-\tau)d\tau = (f*H)(t)$$

Where $H(t)$ is the Heaviside step function:

$$H(t) = \begin{cases} 0 & t < 0 \\ 0.5 & t = 0 \\ 1 & t > 0 \end{cases}$$
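A small numeric sketch of this identity (the grid and test function are my own choices): discretized, $H(t-\tau)$ keeps exactly the samples with $\tau \le t$, so sampling $H$ at the lags $0, h, 2h, \dots$ gives an all-ones kernel, and the discrete convolution of $f$ with it reproduces the running integral of $f$. (I take $H = 1$ at lag $0$ rather than $0.5$; the single point doesn't affect the integral.)

```python
import numpy as np

h = 0.01
t = np.arange(-2.0, 2.0, h)
f = np.sign(t)                         # a discontinuous integrand

H = np.ones(len(t))                    # H(lag) = 1 for lag >= 0
conv = np.convolve(f, H)[:len(t)] * h  # (f * H)(t), Riemann-sum style

running_integral = np.cumsum(f) * h    # direct running integral of f
print(np.max(np.abs(conv - running_integral)))  # ~0: the two coincide
```

The truncation `[:len(t)]` keeps only the causal part of the full convolution, which is exactly the running sum.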

$\endgroup$
1
$\begingroup$

As others have pointed out, derivative operators generally decrease the smoothness of a function. This is true for many classes of function spaces: Sobolev, Besov, Triebel-Lizorkin, etc. In fact, most of these spaces have a smoothness parameter that decreases when you apply a differential operator. Terence Tao has a nice diagram illustrating the relationship between these spaces on his blog.

One thing that I did not see mentioned in the other answers is that derivatives can improve the decay of a function. So if you start from a smooth function that grows, its derivative could be considered nicer than the original.

So I would argue that differentiation creates a trade-off between smoothness and decay. This is most clearly seen in the Fourier domain, where differentiation is basically multiplication by a polynomial. Suppose the original function is $f$ and its Fourier transform is $\widehat{f}$. In the Fourier domain, a derivative of $f$ is given by $P(\omega)\widehat{f}(\omega)$, where $P$ is some polynomial. If $\widehat{f}$ has a singularity, it can be killed by the zeros of the polynomial, thus causing the derivative of $f$ to decay faster (or grow more slowly) than $f$ itself. At the same time, $P$ grows at infinity, which means the derivative of $f$ will be less smooth.
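As a concrete sketch of "differentiation = multiplication by a polynomial in the Fourier domain" (the grid, test function, and tolerance are my own choices): on a periodic grid, multiplying the DFT by $P(\omega) = i\omega$ recovers the derivative to near machine precision for a smooth function.

```python
import numpy as np

n = 256
x = 2 * np.pi * np.arange(n) / n          # periodic grid on [0, 2*pi)
f = np.exp(np.sin(x))                     # a smooth periodic function

omega = np.fft.fftfreq(n, d=1.0 / n)      # integer frequencies 0, 1, ..., -1
df = np.fft.ifft(1j * omega * np.fft.fft(f)).real  # P(omega) = i*omega

exact = np.cos(x) * np.exp(np.sin(x))     # d/dx exp(sin x)
print(np.max(np.abs(df - exact)))         # tiny: spectral accuracy
```

Because $f$ is smooth, $\widehat{f}$ decays rapidly and the growth of $P$ at high frequencies does no harm here; for a rougher $f$, that same growth is what degrades the smoothness of the derivative.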

$\endgroup$
