
I see integrals defined as anti-derivatives but for some reason I haven't come across the reverse. Both seem equally implied by the fundamental theorem of calculus.

This emerged as a sticking point in this question.

  • In a certain sense this is precisely what we do when we give coordinate-independent definitions of the divergence and curl of a vector field: we can define them as the limit of flux or circulation integrals taken over smaller and smaller regions. You see this approach a lot in engineering and physics books. – Commented Sep 12, 2019 at 21:43
  • I seem to recall there actually is an approach something like this involving absolutely continuous measures. Maybe someone can dig that up and post an answer (or correct me if I'm misremembering). – Commented Sep 13, 2019 at 14:48
  • @R..: my answer below points out the role of absolute continuity here. And my comment above, about vector calculus, is related to Lebesgue's differentiation theorem, which considers the average value of an integral taken over smaller and smaller balls. – Commented Sep 13, 2019 at 19:46
  • If you take a glimpse at Papa Rudin, you'll see that this is, more or less, the standard/formal practice: the integral is defined rigorously first, and then something like the derivative is constructed on top of it. – polfosol, Commented Sep 14, 2019 at 14:57

4 Answers


Let $f(x)=0$ for all real $x$.

Here is one anti-integral for $f$:

$$ g(x) = \begin{cases} x &\text{when }x\in\mathbb Z \\ 0 & \text{otherwise} \end{cases} $$ in the sense that $\int_a^b g(x)\,dx = f(b)-f(a)$ for all $a,b$.

How do you explain that the slope of $f$ at $x=5$ is not $g(5)=5$?
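
To make the example concrete, here is a minimal numeric sketch (assuming Python; the helper names are mine): any midpoint Riemann sum misses the integer spikes, so $g$ integrates to $0$ over every interval, exactly as an anti-integral of $f$ must, and yet $g(5)=5$.

```python
# Minimal numeric sketch (my own helper names): midpoint Riemann sums of g
# never sample the integers, so the integral of g over any interval is 0,
# matching f(b) - f(a) = 0 -- yet g(5) = 5, which is not the slope of f.

def g(x):
    return x if float(x).is_integer() else 0.0

def midpoint_sum(func, a, b, n=100_000):
    """Midpoint Riemann sum of func over [a, b] with n equal subintervals."""
    h = (b - a) / n
    return sum(func(a + (k + 0.5) * h) for k in range(n)) * h

print(midpoint_sum(g, 0, 10))  # 0.0: the spikes at integers contribute nothing
print(g(5))                    # 5: nothing like the slope of the zero function
```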


The idea works better if we restrict all the functions we ever look at to "sufficiently nice" ones -- for example, we could insist that everything is real analytic.

Merely looking for a continuous anti-integral wouldn't suffice to recover the usual concept of derivative, because then something like $$ x \mapsto \begin{cases} 0 & \text{when }x=0 \\ x^2\sin(1/x) & \text{otherwise} \end{cases} $$ wouldn't have a derivative on $\mathbb R$ (which it does by the usual definition).
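
For concreteness, here is a small sympy sketch of that example (the code and names are my own illustration): the difference quotient at $0$ has limit $0$, so the derivative exists there, while the $\cos(1/x)$ term in the derivative keeps oscillating, so the derivative cannot be continuous at $0$.

```python
# Sketch: x^2 sin(1/x) (set to 0 at x = 0) is differentiable at 0 via the
# difference quotient, but its derivative oscillates near 0.
import sympy as sp

x, h = sp.symbols("x h", real=True)

# Differentiability at 0: the difference quotient is (f(h) - f(0))/h = h*sin(1/h).
print(sp.limit(h * sp.sin(1 / h), h, 0))      # 0, so f'(0) = 0 exists

# Away from 0 the derivative is 2x*sin(1/x) - cos(1/x); the first term -> 0,
# but the cos(1/x) term has no limit at 0:
print(sp.limit(2 * x * sp.sin(1 / x), x, 0))  # 0
print(sp.limit(sp.cos(1 / x), x, 0))          # AccumBounds(-1, 1): oscillates
```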

  • @mjc: Indeed they are not exact inverse operations -- it's an elementary caveat that a function that has an antiderivative $F$ also has many other antiderivatives, namely $F+c$ for every constant $c$! In other words, "differentiation" is not an injective operation, so strictly speaking it cannot have an inverse operation. In the same way, a function can have several anti-(indefinite-)integrals -- this is less well known only because the anti-integral is not a very common concept. – Commented Sep 12, 2019 at 21:59
  • Good textbooks will phrase the FTC very carefully so that these observations don't invalidate what it says. It is a good exercise to look very closely at the FTC in your favorite textbook and figure out exactly how, say, the existence of many different antiderivatives doesn't conflict with the claim. – Commented Sep 12, 2019 at 22:02
  • @mjc: No. Just because for every number there is some indefinite integral that has that number as a value, doesn't mean that each indefinite integral separately takes all values. For example, take the constant function $f(x)=0$. Every constant function is an indefinite integral of this $f$, but a constant function is very far from being surjective. – Commented Sep 12, 2019 at 23:43
  • "restrict to 'suitably nice' ones" - This makes one wonder something interesting: how would you define something like "real analytic" or a suitable substitute without "the real deal" derivative available ahead of time, only the integral? – Commented Sep 13, 2019 at 7:07
  • @The_Sympathizer A real analytic function can locally be represented by a power series; I think that is the standard definition. It involves neither integrals nor derivatives. – Commented Sep 13, 2019 at 9:00

In a sense your question is very natural. Let's take an informal approach to it, and then see where the technicalities arise. (That's how a lot of research mathematics works, by the way! Have an intuitive idea, and then try to implement it carefully. The devil is always in the details.)

So, one way to tell the familiar story of one-variable calculus is as follows:

  1. Define the derivative $f'$ of a function $f$ as the limit of the difference quotient, $h^{-1}(f(x+h)-f(x))$, as $h\to0$.
  2. Define an anti-derivative of a function $f$ as a function $F$ for which $F'=f$.
  3. Define the definite integral of a function $f$ over $[a,b]$, say as the limit of Riemann sums.
  4. Discover that (2) and (3) are related, in the sense that $$\int_a^bf=F(b)-F(a)$$ so long as $F$ is any anti-derivative of $f$.
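
As a quick sanity check of step 4 before flipping the story, here is a numeric sketch (the helper names are my own): with $f=\cos$ and anti-derivative $F=\sin$, a Riemann sum over $[a,b]$ matches $F(b)-F(a)$.

```python
# Sketch of step 4 in the standard story: a Riemann sum of f = cos over
# [a, b] agrees with F(b) - F(a) for the anti-derivative F = sin.
import math

def riemann(f, a, b, n=100_000):
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

a, b = 0.0, 2.0
print(riemann(math.cos, a, b))    # ~0.909297...
print(math.sin(b) - math.sin(a))  # 0.909297..., matching the sum
```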

Now, your idea is that you can imagine doing this the other way around, as follows:

  1. Define the definite integral of a function $f$ over an interval $[a,b]$, say as a limit of Riemann sums.
  2. Define an anti-integral of a function $F$ as a function $f$ for which $$F(x)-F(0)=\int_0^xf$$
  3. Define the derivative of a function, as the limit of the difference quotient.
  4. Discover that (2) and (3) are related, in the sense that one anti-integral of $F$ is just $F'$, so long as $F'$ is defined. (A numeric sketch follows this list.)
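
The flipped step 4 admits the same kind of check (again a sketch, with names of my choosing): for $F(x)=x^2$, the difference-quotient derivative $2x$ is an anti-integral of $F$ in the sense of step 2.

```python
# Sketch of step 4 in the flipped story: F'(x) = 2x, obtained as a difference
# quotient, is an anti-integral of F(x) = x^2, since its integral from 0 to x
# recovers F(x) - F(0).

def F(x):
    return x * x

def diff_quot(F, x, h=1e-7):
    return (F(x + h) - F(x)) / h

def riemann(f, a, b, n=100_000):
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

x0 = 3.0
print(riemann(lambda t: diff_quot(F, t), 0.0, x0))  # ~9.0
print(F(x0) - F(0.0))                               # 9.0 = F(x0) - F(0)
```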

The trouble in both stories arises in steps 2 and 4. In both versions, step 4 is a form of the Fundamental Theorem.

The Problem with Step 2

In both the standard and the flipped story, step 2 poses existence and uniqueness problems.

In the standard story, an anti-derivative of $f$ may not even exist; one sufficient condition is to require that $f$ be continuous, but that is not necessary. And even if you do require that $f$ be continuous, you're always going to have non-uniqueness. Thus "anti-differentiation" construed as an operation is not really a bona fide "inverse" operation, because it is not single-valued. Or in other words, differentiation is not injective: it identifies many different functions. (Exactly which functions it identifies depends on the topology of the domain they're defined on.)

In the flipped story, again note that we certainly will never have uniqueness. Given any anti-integral $f$, you can find infinitely many others by changing the values of $f$ at a set of measure zero. We also aren't guaranteed existence of an anti-integral for a given $F$, and this time not even the continuity of $F$ will serve as a sufficient condition. What we need is even stronger, "absolute continuity."
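
The standard witness for that last claim is the Cantor function: continuous, but with no anti-integral. Here is a hedged numeric sketch (the recursion below is the usual self-similar construction): the function climbs from $0$ to $1$, yet is locally constant off a set of measure zero, so any candidate anti-integral would integrate to $0$, contradicting $F(1)-F(0)=1$.

```python
# Sketch: the Cantor function F is continuous but not absolutely continuous.
# It is locally constant almost everywhere (slope 0 off the Cantor set), yet
# F(1) - F(0) = 1, so no f can satisfy F(x) - F(0) = integral of f from 0 to x.

def cantor(x, depth=50):
    """Approximate the Cantor function via its self-similar recursion."""
    value, scale = 0.0, 0.5
    for _ in range(depth):
        if x < 1/3:
            x = 3 * x
        elif x <= 2/3:
            return value + scale
        else:
            value += scale
            x = 3 * x - 2
        scale /= 2
    return value

print(cantor(1.0) - cantor(0.0))                  # ~1.0: total climb is 1
print((cantor(0.5 + 1e-6) - cantor(0.5)) / 1e-6)  # 0.0: flat near x = 0.5
```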

The Problem with Step 4

In the standard story, the catch is in "so long as $F$ is any anti-derivative of $f$." The problem is that not every Riemann integrable function has an anti-derivative. If we want to guarantee an anti-derivative, we can impose the additional hypothesis that $f$ is continuous (which is again sufficient but not necessary).

A similar problem arises in the flipped scenario: given an arbitrary $F$, it might not have an anti-integral. The fundamental theorem for Lebesgue integrals shows that it's both necessary and sufficient to require that $F$ be absolutely continuous, at least when we work with the Lebesgue definite integral instead of the Riemann definite integral. But given the fact that integrals are not sensitive to values on a set of measure zero, the best conclusion we can draw in that case is that an anti-integral of $F$ equals $F'$ "almost everywhere" (meaning, everywhere except on a set of measure zero).


The Upshot

Note that even in the familiar story, we don't define integrals as anti-derivatives. Thus you should not expect that we could define derivatives as anti-integrals. The essential obstruction to either definition is the failure of existence and uniqueness.

In both scenarios, we first specify the seemingly unrelated limit-based definitions of derivatives and definite integrals. We then discover a relationship about how anti-derivatives are related to integrals (the standard story) or how anti-integrals are related to derivatives (the flipped story), assuming enough regularity of the functions involved to resolve the existence and uniqueness problems.

  • Detailed and well done! – Allawonder, Commented Sep 13, 2019 at 21:23

From the point of view of analysis (as hinted at in Henning Makholm's answer), the issue is that the integration map $I$, which sends an integrand $g$ to the function $x\mapsto\int_a^x g(t)\,dt$, is very far from one-to-one. When you try to invert it, you find that a great many functions are possible "anti-integrals" of a given function. This happens for differentiation $d:f\mapsto f'$ as well, but there is a robust mathematical theory about how to address it and how to describe the set of anti-derivatives of a given function. For example, if $f$ is continuous on $[a,b]$, then every antiderivative of $f$ is of the form $$F_c(x)=c + \int_a^x f(t)\,dt$$ for some constant $c$. In some contexts the situation becomes more complicated (for example, for $1/x$ defined on $[-1,0)\cup(0,1]$ you get two constants, one for each side; see the sketch below), and there is a whole theory of what happens for various domains.
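
Here is the $1/x$ point as a small numeric sketch (the function names are mine): shifting $\log|x|$ by a different constant on each component leaves the derivative untouched.

```python
# Sketch: on [-1, 0) U (0, 1], antiderivatives of 1/x carry one constant per
# connected component.  F2 adds 3 on the left piece and -5 on the right piece,
# and both F1 and F2 still have slope 1/x everywhere on the domain.
import math

def F1(x):
    return math.log(abs(x))

def F2(x):
    return F1(x) + (3.0 if x < 0 else -5.0)

def slope(F, x, h=1e-7):
    return (F(x + h) - F(x - h)) / (2 * h)

for x in (-0.5, 0.25):
    print(slope(F1, x), slope(F2, x), 1 / x)  # all three agree at each x
```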

The situation for inverting $I$ is a lot less rosy. For one thing, you can move the function's values around however you like on any finite subset of the domain without changing the integral. More generally, two functions that agree except on a set of measure zero have the same integral, so an anti-integral is at best determined "almost everywhere" (see the numeric sketch below). As far as I know, there is no fruitful way to single out one canonical function from such a set, a fact with deep repercussions in functional analysis and machine learning.
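
A numeric sketch of that non-uniqueness (the helper names are mine): two integrands differing on a finite set give identical sums.

```python
# Sketch: f2 differs from f1 = x^2 only on the finite set {0, 1, 2}, a set of
# measure zero, and any tagging that misses those points (midpoints here)
# yields the same integral -- so both are anti-integrals of x^3/3.

def f1(x):
    return x * x

def f2(x):
    return -100.0 if x in (0.0, 1.0, 2.0) else x * x

def midpoint_sum(f, a, b, n=100_000):
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

print(midpoint_sum(f1, 0.0, 3.0))  # ~9.0
print(midpoint_sum(f2, 0.0, 3.0))  # ~9.0 as well
```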

A second issue is that an anti-integral need not be differentiable, so you cannot always follow anti-integration with differentiation. For example, let $1_\mathbb{Q}$ denote the function that takes the value $1$ at rational inputs and $0$ at irrational inputs. Its Lebesgue integral over any interval is $0$ (a similar example works for the Riemann integral, but it takes more work), so $1_\mathbb{Q}$ is an anti-integral of $f(x)=0$. But $1_\mathbb{Q}$ is nowhere differentiable, so you cannot differentiate it to get back $f(x)=0$.
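
To see why $1_\mathbb{Q}$ itself is not Riemann integrable, here is a sketch (the tagging scheme is my illustration; exact rationals are modeled with `fractions.Fraction`, and float tags stand in for irrational points): rationally tagged Riemann sums are always $1$, irrationally tagged ones always $0$, so the sums never converge.

```python
# Sketch: Riemann sums of 1_Q over [0, 1] depend entirely on the tags chosen,
# so the Riemann integral does not exist.  (The Lebesgue integral is 0, since
# the rationals have measure zero -- but no sampling scheme can detect that.)
from fractions import Fraction
import math

def one_Q(x):
    """Indicator of Q, for tags we can classify exactly: Fractions are
    rational; the float tags below stand in for irrational points."""
    return 1 if isinstance(x, Fraction) else 0

n = 1000
rational_tags = [Fraction(k, n) for k in range(n)]                    # k/n
irrational_tags = [k / n + math.sqrt(2) / (10 * n) for k in range(n)]

print(sum(one_Q(t) for t in rational_tags) / n)    # 1.0: every tag is rational
print(sum(one_Q(t) for t in irrational_tags) / n)  # 0.0: every tag misses Q
```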

A commenter mentions vector calculus, and it is true that something like this happens in vector calculus, but there are a couple of massive caveats.

  • Not all integrable functions have primitives. I refer to the claim in your first example. – Allawonder, Commented Sep 13, 2019 at 21:25
  • I downvoted. That "mapping" $I$ is not even well defined; it's not just "extremely not one-to-one" (whatever that means). A mapping must associate a single output to each input, which is not the case here. This should be addressed from the beginning; I don't like the way it is written here. – Commented Sep 16, 2019 at 21:15

Weak derivatives.

This is essentially the way one defines a weak derivative. If a function is not differentiable in the traditional sense but is integrable, then one may define a weaker notion of derivative through duality: the weak derivative of $f$ is the function $f'$ such that $$ \int f' u=-\int f u' $$ for all smooth, compactly supported test functions $u$ (so that the boundary terms of integration by parts vanish). One can prove that $f'$ is unique as an element of $L^p$, that is, unique up to a set of measure zero. If $f$ is differentiable in the standard sense, then it is also differentiable in the weak sense, and both derivatives agree.
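
As a sanity check of the defining identity on a classic example (a sympy sketch of my own; $u$ is one convenient rapidly decaying test function, for which the boundary terms also vanish): the weak derivative of $|x|$ is $\operatorname{sign}(x)$, and both sides of the identity evaluate to the same number.

```python
# Sketch: verify  integral(f' u) = -integral(f u')  for f = |x|, whose weak
# derivative is sign(x), using the test function u(x) = x*exp(-x^2).  Each
# integral is split at 0 by hand so every integrand handed to sympy is smooth.
import sympy as sp

x = sp.symbols("x", real=True)
u = x * sp.exp(-x**2)   # a smooth, rapidly decaying test function
up = sp.diff(u, x)

# f(x) = |x| is -x on the left half-line and x on the right; f' = sign(x).
lhs = sp.integrate(-1 * u, (x, -sp.oo, 0)) + sp.integrate(1 * u, (x, 0, sp.oo))
rhs = -(sp.integrate(-x * up, (x, -sp.oo, 0)) + sp.integrate(x * up, (x, 0, sp.oo)))
print(lhs, rhs)  # both equal 1, as the weak-derivative identity requires
```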

For example, the Dirichlet function $1_{\mathbb Q}$ is nowhere continuous, let alone differentiable. But its weak derivative exists, and is in fact the zero function. Indeed, since $1_{\mathbb Q}$ vanishes almost everywhere, $$ 0=\int 1_{\mathbb Q}\, u'=-\int 1'_{\mathbb Q}\, u $$ for every test function $u$, which forces $1'_{\mathbb Q}=0$ almost everywhere.
