174
$\begingroup$

What are your favorite applications of integration by parts?

(The answers can be as lowbrow or highbrow as you wish. I'd just like to get a bunch of these in one place!)

Thanks in advance for your contributions!

$\endgroup$
9
  • 64
    $\begingroup$ It can also be a good career move. A (likely apocryphal) story goes: when Peter Lax was awarded the National Medal of Science, the other recipients (presumably non-mathematicians) asked him what he did to deserve the Medal. Lax responded: "I integrated by parts." $\endgroup$ Commented Apr 24, 2011 at 23:42
  • 5
    $\begingroup$ Great story, Willy. $\endgroup$
    – Jon Bannon
    Commented Apr 28, 2011 at 17:40
  • 27
    $\begingroup$ Two more stories: 1. Supposedly when Laurent Schwartz received the Fields Medal (for his work on distributions, of course), someone present remarked, "So now they're giving the Fields Medal for integration by parts." 2. I believe I remember reading -- but have no idea where -- that someone once said that a really good analyst can do marvelous things using only the Cauchy-Schwarz inequality and integration by parts. I do think there's some truth to that. $\endgroup$ Commented Oct 11, 2011 at 2:15
  • 4
    $\begingroup$ More physics, but it's useful in the derivation of the Euler-Lagrange equation, which itself is very nice. $\endgroup$
    – Meow
    Commented Jan 12, 2013 at 12:31
  • 3
    $\begingroup$ @WillieWong Your comment is quoted in the book "Physics from Symmetry" books.google.de/… $\endgroup$
    – jak
    Commented Jun 20, 2015 at 11:25

20 Answers

156
$\begingroup$

I always liked the derivation of Taylor's formula with error term:

$$\begin{array}{rl} f(x) &= f(0) + \int_0^x f'(x-t) \,dt\\ &= f(0) + xf'(0) + \int_0^x tf''(x-t)\,dt\\ &= f(0) + xf'(0) + \frac{x^2}2f''(0) + \int_0^x \frac{t^2}2 f'''(x-t)\,dt \end{array}$$

and so on. Using the mean value theorem on the final term readily gives the Cauchy form for the remainder.

$\endgroup$
3
  • 36
    $\begingroup$ The error term in integral form is just badass. $\endgroup$
    – Pedro
    Commented Feb 23, 2012 at 17:00
  • $\begingroup$ I'm certainly missing something obvious but don't we have: $\int_0^x f'(x-t) \,dt= f(x-t) \Big |_0^x=f(x-x)-f(x-0)=f(0)-f(x)$, which is wrong by a minus sign? $\endgroup$
    – jak
    Commented Nov 13, 2014 at 11:30
  • 10
    $\begingroup$ @JakobH Note that the integration variable $t$ has a minus sign, $f(x-t)$. $\endgroup$ Commented Nov 14, 2014 at 12:13
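Since the identity is exact, it is easy to check numerically. Below is a small Python sketch (the Simpson-rule helper and the choice $f=\sin$ are my own, not part of the answer) verifying the two-term expansion with integral remainder:

```python
import math

def simpson(f, a, b, n=2000):
    """Composite Simpson's rule on [a, b]; n must be even."""
    h = (b - a) / n
    s = f(a) + f(b) + sum((4 if k % 2 else 2) * f(a + k * h) for k in range(1, n))
    return s * h / 3

def taylor2_with_remainder(x):
    # f = sin: f(0)=0, f'(0)=1, f''(0)=0, f'''(t) = -cos(t)
    remainder = simpson(lambda t: (t ** 2 / 2) * (-math.cos(x - t)), 0, x)
    return 0 + x * 1 + (x ** 2 / 2) * 0 + remainder

for x in (0.5, 1.0, 2.0):
    assert abs(taylor2_with_remainder(x) - math.sin(x)) < 1e-8
```

The agreement is to quadrature precision for any $x$, since the formula is an identity, not an approximation.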
115
$\begingroup$

My favorite this week, since I learned it just yesterday: $n$ integrations by parts produce $$ \int_0^1 \frac{(-x\log x)^n}{n!}dx = (n+1)^{-(n+1)}.$$ Then summing over $n \ge 0$ yields $$\int_0^1 x^{-x}\,dx = \sum_{n=1}^\infty n^{-n}.$$

$\endgroup$
1
  • 25
    $\begingroup$ Sophomore's dream. $\endgroup$ Commented Jul 28, 2014 at 0:57
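Both the term-by-term identity and the summed form can be confirmed numerically. A Python sketch (the quadrature helper and tolerances are my own choices):

```python
import math

def simpson(f, a, b, n=20000):
    """Composite Simpson's rule on [a, b]; n must be even."""
    h = (b - a) / n
    s = f(a) + f(b) + sum((4 if k % 2 else 2) * f(a + k * h) for k in range(1, n))
    return s * h / 3

# x^{-x} = exp(-x log x); the integrand extends continuously to 1 at x = 0
integrand = lambda x: math.exp(-x * math.log(x)) if x > 0 else 1.0
integral = simpson(integrand, 0, 1)

series = sum(n ** -n for n in range(1, 20))   # converges extremely fast
assert abs(integral - series) < 1e-6

# the term-by-term identity for n = 2: ∫₀¹ (-x log x)²/2! dx = 3⁻³
term2 = simpson(lambda x: (-x * math.log(x)) ** 2 / 2 if x > 0 else 0.0, 0, 1)
assert abs(term2 - 3 ** -3) < 1e-9
```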
114
$\begingroup$

Let $f$ be a differentiable one-to-one function, and let $f^{-1}$ be its inverse. Then,

$$\int f(x) dx = x f(x) - \int x f'(x)dx = x f(x) - \int f^{-1}(f(x))f'(x)dx = x f(x) - \int f^{-1}(u) du \,.$$

Thus, if we know the integral of $f^{-1}$, we get the integral of $f$ for free.

BTW: This is the reason why the integrals $\int \ln(x) dx \,;\, \int \arctan(x) dx \,; ...$ are always calculated using integration by parts.

$\endgroup$
7
  • 3
    $\begingroup$ Among other things: this is one way to derive the indefinite integral of the Lambert function $W(x)$, the inverse of $x\exp\,x$. $\endgroup$ Commented Oct 11, 2011 at 8:02
  • 10
    $\begingroup$ I'd write the last integral as $\left.\int f^{-1}(u)\> du\right|_{u:=f(x)}$ or similar. $\endgroup$ Commented Oct 11, 2011 at 8:11
  • 27
    $\begingroup$ It's a very nice exercise to derive this identity geometrically, by considering both integrals as areas.... $\endgroup$ Commented Oct 11, 2011 at 8:13
  • 1
    $\begingroup$ @goblin Any result which is true for the indefinite integrals becomes trivially true when the end points are added.... Why would you enjoy more a particular case than the general one? ;) $\endgroup$
    – N. S.
    Commented Aug 6, 2015 at 13:11
  • 2
    $\begingroup$ @MrReality upload.wikimedia.org/wikipedia/commons/5/59/… $\endgroup$
    – N. S.
    Commented Oct 18, 2017 at 0:02
98
$\begingroup$

Repeated integration by parts gives $$\int_0^\infty x^n e^{-x} dx=n!$$

$\endgroup$
3
  • 62
    $\begingroup$ ...which is dual to $\sum_{n\ge0} x^n/n! = e^x$. $\endgroup$
    – Mitch
    Commented Feb 4, 2011 at 18:12
  • 6
    $\begingroup$ @Mitch, is this duality a consequence of any deeper facts? How does it generalize, if at all? $\endgroup$
    – Skatche
    Commented Apr 25, 2011 at 7:13
  • 9
    $\begingroup$ @Skatche: Excellent point, especially since I asked exactly that question immediately after I posted that comment. So see that link for discussion. $\endgroup$
    – Mitch
    Commented Apr 25, 2011 at 13:49
56
$\begingroup$

Highbrow: Let $f(\theta)$ be a smooth function from the circle to $\mathbb{R}$. The Fourier coefficients of $f$ are given by $a_n = 1/(2 \pi) \int f(\theta) e^{-i n \theta} d \theta$.

Integrating by parts: $$a_n = \frac{1}{n} \frac{i}{2 \pi} \int f'(\theta) e^{- i n \theta} d \theta = \frac{1}{n^2} \frac{-1}{2 \pi} \int f''(\theta) e^{- i n \theta} d \theta = \cdots$$ $$\cdots = \frac{1}{n^k} \frac{i^k}{2 \pi} \int f^{(k)}(\theta) e^{- i n \theta} d \theta = O(1/n^k)$$ for any $k$.

Thus, if $f$ is smooth, its Fourier coefficients die off faster than $1/n^k$ for any $k$. More generally, if $f$ has $k$ continuous derivatives, then $a_n = O(1/n^k)$.

$\endgroup$
0
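The decay is easy to see numerically. In the Python sketch below (my own; the smooth test function $e^{\cos\theta}$ and the equally spaced trapezoid rule, which is spectrally accurate for periodic integrands, are assumptions, not part of the answer), even $n^4 |a_n|$ tends to zero:

```python
import cmath, math

def fourier_coeff(f, n, N=4096):
    """a_n = (1/2π) ∫ f(θ) e^{-inθ} dθ via the trapezoid rule on N equispaced points."""
    return sum(f(2 * math.pi * j / N) * cmath.exp(-1j * n * 2 * math.pi * j / N)
               for j in range(N)) / N

f = lambda t: math.exp(math.cos(t))          # smooth and 2π-periodic
coeffs = [abs(fourier_coeff(f, n)) for n in range(1, 13)]

# the decay beats any fixed power: here n⁴|a_n| still shrinks
weighted = [n ** 4 * a for n, a in zip(range(1, 13), coeffs)]
assert weighted[-1] < weighted[0]
assert coeffs[11] < 1e-8                     # |a_12| is already tiny
```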
32
$\begingroup$

As with Taylor's Theorem, the Euler-Maclaurin summation formula (with remainder) can be derived using repeated application of integration by parts.

Tom Apostol's paper "An Elementary View of Euler's Summation Formula" (American Mathematical Monthly 106 (5): 409–418, 1999) has a more in-depth discussion of this. See also Vito Lampret's "The Euler-Maclaurin and Taylor Formulas: Twin, Elementary Derivations" (Mathematics Magazine 74 (2): 109-122, 2001).

$\endgroup$
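As a tiny illustration of the formula (my own sketch, not taken from either paper): for $f(x)=x^2$ all derivatives past the second vanish, so the integral plus the first Euler-Maclaurin correction terms reproduce $\sum_{k=0}^n k^2 = n(n+1)(2n+1)/6$ exactly:

```python
def euler_maclaurin_sum_squares(n):
    """Σ_{k=0}^{n} k² via Euler–Maclaurin: ∫₀ⁿ x²dx + (f(0)+f(n))/2 + (f'(n)-f'(0))/12.
    For a quadratic f the remainder vanishes, so the formula is exact."""
    integral = n ** 3 / 3           # ∫₀ⁿ x² dx
    boundary = (0 + n ** 2) / 2     # (f(0) + f(n)) / 2
    bernoulli = (2 * n - 0) / 12    # (B₂/2!)·(f'(n) − f'(0)), with B₂ = 1/6
    return integral + boundary + bernoulli

for n in (5, 10, 100):
    assert abs(euler_maclaurin_sum_squares(n) - sum(k * k for k in range(n + 1))) < 1e-6
```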
29
$\begingroup$

Perhaps not really an application, but the definition of the derivative of a distribution is based on partial integration:

if $u\in C^1(X)$ and $\phi\in C^\infty_c(X)$ is a test function, then

$\left<\partial_i u,\phi\right>=\int\phi\partial_i u=-\int u\partial_i\phi=-\left<u,\partial_i\phi\right>$ by partial integration.

Extending this, for a distribution $u$ we then define its derivative $\partial_i u$ by this formula.

$\endgroup$
1
  • 1
    $\begingroup$ I find "applications" like this intriguing. No need for any apologetic tone here. Thanks for the answer! $\endgroup$
    – Jon Bannon
    Commented Feb 5, 2011 at 20:29
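The defining identity $\langle\partial u,\phi\rangle=-\langle u,\partial\phi\rangle$ can be sanity-checked numerically for the classic example $u(x)=|x|$, whose weak derivative is $\operatorname{sign}(x)$. A Python sketch (the polynomial test function and quadrature helper are my own choices):

```python
def simpson(f, a, b, n=2000):
    """Composite Simpson's rule on [a, b]; n must be even."""
    h = (b - a) / n
    s = f(a) + f(b) + sum((4 if k % 2 else 2) * f(a + k * h) for k in range(1, n))
    return s * h / 3

# test function supported in [-1, 1], vanishing with its derivative at ±1
phi  = lambda x: (1 - x * x) ** 2 * (x + 2)
dphi = lambda x: -4 * x * (1 - x * x) * (x + 2) + (1 - x * x) ** 2

# <∂u, φ> for u = |x|, whose weak derivative is sign(x): split the integral at 0
pairing_du = -simpson(phi, -1, 0) + simpson(phi, 0, 1)

# -<u, φ'>
pairing_u = -simpson(lambda x: abs(x) * dphi(x), -1, 1)

assert abs(pairing_du - pairing_u) < 1e-9
```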
29
$\begingroup$

Highbrow: Derivation of the Euler-Lagrange equations describing how a physical system evolves through time from Hamilton's Least Action Principle.

Here's a very brief summary. Consider a very simple physical system consisting of a point mass moving under the force of gravity, and suppose you know the position $q$ of the point at two times $t_0$ and $t_f$. Possible trajectories of the particle as it moved from its starting to ending point correspond to curves $q(t)$ in $\mathbb{R}^3$.

One of these curves describes the physically-correct motion, wherein the particle moves in a parabolic arc from one point to the other. Many curves completely defy the laws of physics, e.g. the point zigs and zags like a UFO as it moves from one point to the other.

Hamilton's Principle gives a criterion for determining which curve is the physically correct trajectory; it is the curve $q(t)$ satisfying the variational principle

$$\min_q \int_{t_0}^{t_f} L(q, \dot{q}) dt$$ subject to the constraints $q(t_0) = q_0, q(t_f) = q_f$, where $L$ is a scalar-valued function known as the Lagrangian that measures the difference between the kinetic and potential energy of the system at a given moment of time. (Pedantry alert: despite being historically called the "least" action principle, really instead of minimizing we should be extremizing; i.e., all critical points of the above functional are physical trajectories, even those that are maxima or saddle points.)

It turns out that a curve $q$ satisfies the variational principle if and only if it is a solution to the ODE $$ \frac{d}{dt} \frac{\partial L}{\partial \dot{q}} - \frac{\partial L}{\partial q} = 0,$$ roughly equivalent to the usual Newton's Second Law $ma-F=0$, and the key step in the proof of this equivalence is integration by parts. What is remarkable here is that we started with a boundary-value problem -- given two positions, how did we get from one to the other? -- and ended with an ODE, an initial-value problem -- given an initial position and velocity, how does the point move as we advance through time?

$\endgroup$
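One can watch the principle at work numerically by discretizing the action for a falling particle: the sampled parabola beats every perturbed path with the same endpoints. A Python sketch (the discretization scheme and the constants $g$, $T$ are my own assumptions):

```python
import math

# free fall with g = 9.8, m = 1, from q(0) = 0 to q(T) = 0
g, T, N = 9.8, 1.0, 1000
dt = T / N
ts = [i * dt for i in range(N + 1)]

def action(q):
    """Discretized action  S = Σ [½((q_{i+1}-q_i)/dt)² − g·(q_i+q_{i+1})/2]·dt."""
    S = 0.0
    for i in range(N):
        v = (q[i + 1] - q[i]) / dt
        S += (0.5 * v * v - g * 0.5 * (q[i] + q[i + 1])) * dt
    return S

parabola = [0.5 * g * t * (T - t) for t in ts]      # the physical trajectory
S0 = action(parabola)

# every admissible perturbation vanishing at the endpoints raises the action
for eps in (0.1, -0.1, 0.3):
    perturbed = [q + eps * math.sin(math.pi * t / T) for q, t in zip(parabola, ts)]
    assert action(perturbed) > S0
```

For this Lagrangian the second variation $\int \tfrac12\dot\eta^2$ is positive, so the physical path really is a strict minimum, not just a critical point.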
25
$\begingroup$

My favorite example is getting an asymptotic expansion: for example, suppose we want to compute $\int_x^\infty e^{-t^2}\cos(\beta t)dt$ for large values of $x$. Integrating by parts multiple times we end up with $$ \int_x^\infty e^{-t^2}\cos(\beta t)dt \sim e^{-x^2}\sum_{k=1}^\infty(-1)^n\frac{H_{k-1}(x)}{\beta^k} \begin{cases} \cos(\beta x) & k=2n \\ \sin(\beta x) & k=2n+1 \end{cases}$$ where the Hermite polynomials are given by $H_n(x) = (-1)^ne^{x^2}\frac{d^n}{dx^n}e^{-x^2}$.

This expansion follows from mechanically applying integration by parts multiple times and gives a nice asymptotic expansion (which is divergent as a power series).

$\endgroup$
23
$\begingroup$

Highbrow: Integration by parts can be used to compute (or verify) formal adjoints of differential operators. For instance, one can verify, and this was indeed the proof I saw, that the formal adjoint of the Dolbeault operator $\bar{\partial}$ on complex manifolds is $$\bar{\partial}^* = -* \bar{\partial} \,\,\, *, $$ where $*$ is the Hodge star operator, using integration by parts.

$\endgroup$
23
$\begingroup$

My favorite example of integration by parts (there are other nice tricks as well in this example but integration by parts starts it off) is this:

Let $I_n = \displaystyle \int_{0}^{\frac{\pi}{2}} \sin^n(x) dx$.

$I_n = \displaystyle \int_{0}^{\frac{\pi}{2}} \sin^{n-1}(x) d(-\cos(x)) = -\sin^{n-1}(x) \cos(x) |_{0}^{\frac{\pi}{2}} + \int_{0}^{\frac{\pi}{2}} (n-1) \sin^{n-2}(x) \cos^2(x) dx$

The first expression on the right hand side is zero since $\sin(0) = 0$ and $\cos(\frac{\pi}{2}) = 0$.

Now rewrite $\cos^2(x) = 1 - \sin^2(x)$ to get

$I_n = (n-1) (\displaystyle \int_{0}^{\frac{\pi}{2}} \sin^{n-2}(x) dx - \int_{0}^{\frac{\pi}{2}} \sin^{n}(x) dx) = (n-1) I_{n-2} - (n-1) I_n$.

Rearranging we get $n I_n = (n-1) I_{n-2}$, $I_n = \frac{n-1}{n}I_{n-2}$.

Using this recurrence we get $$I_{2k+1} = \frac{2k}{2k+1}\frac{2k-2}{2k-1} \cdots \frac{2}{3} I_1$$

$$I_{2k} = \frac{2k-1}{2k}\frac{2k-3}{2k-2} \cdots \frac{1}{2} I_0$$

$I_1$ and $I_0$ can be directly evaluated to be $1$ and $\frac{\pi}{2}$ respectively and hence,

$$I_{2k+1} = \frac{2k}{2k+1}\frac{2k-2}{2k-1} \cdots \frac{2}{3}$$

$$I_{2k} = \frac{2k-1}{2k}\frac{2k-3}{2k-2} \cdots \frac{1}{2} \frac{\pi}{2}$$

$\endgroup$
10
  • 1
    $\begingroup$ This is what is usually called a reduction formula $\endgroup$
    – Abel
    Commented Feb 5, 2011 at 0:12
  • 12
    $\begingroup$ This is also called Wallis formula/product I believe. $\endgroup$
    – Aryabhata
    Commented Feb 5, 2011 at 18:04
    $\begingroup$ @Aryabhata Yes. This would've been more interesting if he showed how to get it. It's not too hard. $\endgroup$
    – Pedro
    Commented Feb 23, 2012 at 16:58
  • $\begingroup$ @PeterT.off: Are you talking about the infinite version? He did show the finite version. $\endgroup$
    – Aryabhata
    Commented Feb 23, 2012 at 17:00
    $\begingroup$ @Aryabhata I've never seen the Wallis finite product; I've always seen Wallis's infinite product. I guess it'd be better to at least hint what $\dfrac{I_{2k+1}}{I_{2k}}$ is, and that it tends to 1. $\endgroup$
    – Pedro
    Commented Feb 23, 2012 at 17:05
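The recurrence and the closed forms are easy to verify numerically. A Python sketch (the Simpson-rule helper is my own):

```python
import math

def simpson(f, a, b, n=2000):
    """Composite Simpson's rule on [a, b]; n must be even."""
    h = (b - a) / n
    s = f(a) + f(b) + sum((4 if k % 2 else 2) * f(a + k * h) for k in range(1, n))
    return s * h / 3

def I(n):
    """I_n = ∫₀^{π/2} sinⁿ(x) dx, computed numerically."""
    return simpson(lambda x: math.sin(x) ** n, 0, math.pi / 2)

# the reduction formula  I_n = (n-1)/n · I_{n-2}
for n in range(2, 10):
    assert abs(I(n) - (n - 1) / n * I(n - 2)) < 1e-9

# closed forms, e.g. I_4 = (3/4)(1/2)(π/2) and I_5 = (4/5)(2/3)·1
assert abs(I(4) - (3 / 4) * (1 / 2) * (math.pi / 2)) < 1e-9
assert abs(I(5) - (4 / 5) * (2 / 3)) < 1e-9
```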
22
$\begingroup$

A lowbrow favorite of mine:

$$\int \frac{1}{x} dx = \frac{1}{x} \cdot x - \int x \cdot\left(-\frac{1}{x^2}\right) dx = 1 + \int \frac{1}{x} dx$$

Therefore, $1=0$.

A bit more highbrow, I like the use of partial integration to establish recursive formulas for integrals.

$\endgroup$
6
  • $\begingroup$ Hm, this example does not depend on integration by parts so much as it depends on not keeping track of the limits of integration. $\endgroup$ Commented Feb 4, 2011 at 16:05
  • $\begingroup$ It's true that the crux of the problem is not so much in the integration by parts, but if you integrate in a different way (what way, by the way?) you won't have that problem. $\endgroup$ Commented Feb 4, 2011 at 19:40
  • 25
    $\begingroup$ @Greg: Actually, it's not the limits of integration that matter here, but the constant of integration. $\int \frac{1}{x}\,dx$ is the entire family of antiderivatives, which is exactly the same as the family you get if you add $1$ to every member of the family. $\endgroup$ Commented Feb 4, 2011 at 22:08
  • 17
    $\begingroup$ Perhaps a more direct proof, using the same idea: $1 = \sin^2 x + \cos^2 x = \int \frac{d}{dx} (\sin^2 x + \cos^2 x)dx = \int (2 \sin x \cdot \cos x - 2 cos x \cdot \sin x)dx = \int 0 dx = 0$. $\endgroup$
    – Mike F
    Commented Apr 16, 2011 at 6:47
  • $\begingroup$ @Raskolnikov, "but if you integrate in a different way (what way, by the way?) you won't have that problem"$-$ what way(BTW)? $\endgroup$ Commented Oct 17, 2017 at 14:32
16
$\begingroup$

\begin{align}
\int_{-\infty}^{\infty}\frac{\sin^{2}(x)}{x^{2}}\,dx
&= \left.-\,\frac{\sin^{2}(x)}{x}\right\vert_{-\infty}^{\infty}
 + \int_{-\infty}^{\infty}\frac{2\sin(x)\cos(x)}{x}\,dx
 = \int_{-\infty}^{\infty}\frac{\sin(2x)}{x}\,dx \\[3mm]
&= \int_{-\infty}^{\infty}\frac{\sin(x)}{x}\,dx
\end{align}

$\endgroup$
2
  • 2
    $\begingroup$ I am wondering about the last step in the integral. How can we change the 2x to x in the argument of the sine function? Does this have to do with the infinite limits? $\endgroup$ Commented Oct 7, 2018 at 9:51
  • 2
    $\begingroup$ @Saudman97 It's equivalent to the change $\displaystyle t \equiv 2x$ such that $\displaystyle{\sin\left(2x\right) \over x}\,\mathrm{d}x$ goes over $\displaystyle{\sin\left(t\right) \over t}\,\mathrm{d}t$ and $\displaystyle x \to \pm\infty \implies t \to \pm\infty$, respectively. As $\displaystyle x$ and $\displaystyle t$ are "mute variables" you can still use $\displaystyle x$ instead of $\displaystyle t$ in the last integral. $\endgroup$ Commented Oct 7, 2018 at 20:30
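The integration-by-parts step itself can be checked on a finite interval, where the boundary term $-\sin^2 x/x$ does not yet vanish: $\int_{-a}^{a}\frac{\sin^2 x}{x^2}\,dx = \left[-\frac{\sin^2 x}{x}\right]_{-a}^{a}+\int_{-a}^{a}\frac{\sin 2x}{x}\,dx$. A Python sketch (the finite cutoff $a=10$ and the quadrature helper are my own choices):

```python
import math

def simpson(f, a, b, n=20000):
    """Composite Simpson's rule on [a, b]; n must be even."""
    h = (b - a) / n
    s = f(a) + f(b) + sum((4 if k % 2 else 2) * f(a + k * h) for k in range(1, n))
    return s * h / 3

a = 10.0
f1 = lambda x: (math.sin(x) / x) ** 2 if x else 1.0   # sin²x/x², value 1 at x = 0
f2 = lambda x: math.sin(2 * x) / x if x else 2.0      # sin(2x)/x,  value 2 at x = 0

lhs = simpson(f1, -a, a)
boundary = -2 * math.sin(a) ** 2 / a                  # [−sin²x/x] evaluated from −a to a
rhs = boundary + simpson(f2, -a, a)
assert abs(lhs - rhs) < 1e-8
```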
10
$\begingroup$

Integrating by parts is the how one discovers the adjoint of a differential operator, and thus becomes the foundation for the marvelous spectral theory of differential operators. This has always seemed to me to be both elementary and profound at the same time.

$\endgroup$
9
$\begingroup$

Here is one of many integration-by-parts derivations that I like:

The Gamma Distribution

A random variable is said to have a gamma distribution with parameters $(\alpha,\lambda)$, $\lambda\gt 0,~\alpha\gt 0,$ if its density function is given by the following

$$ f(x)= \begin{cases} \frac{\lambda e^{-\lambda\:x}(\lambda x)^{\alpha-1}}{\Gamma(\alpha)}~~~\text{for }~x\ge 0 \\ \\ 0 \hspace{1.09in} {\text{for }}~x\lt 0 \end{cases} $$

where $\Gamma(\alpha),$ called the gamma function is defined as

$$ \Gamma(\alpha) = \int_{0}^{\infty} \! e^{-y} y^{\alpha-1}\, \mathrm{d}y $$

Integrating $\Gamma(\alpha)$ by parts yields the following

$$ \begin{array}{ll} \Gamma(\alpha) &=\; -e^{-y} y^{\alpha-1} \Bigg|_{0}^{\infty}~+~\int_{0}^{\infty} \! e^{-y} (\alpha-1)y^{\alpha-2}\,\mathrm{d}y \\ \\ \;&=\; (\alpha-1) \int_{0}^{\infty} \! e^{-y} y^{\alpha-2}\,\mathrm{d}y ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~(1) \\ \\ \;&=\; (\alpha-1) \Gamma(\alpha-1) \end{array} $$

For integral values of $\alpha,$ let's say, $\alpha=n,$ we will obtain, by applying Equation ($1$) repeatedly,

\[ \begin{array}{llll} \Gamma(n)&=(n-1)\Gamma(n-1) \\ &=(n-1)(n-2)\Gamma(n-2) \\ &=\ldots \\ &=(n-1)(n-2)\ldots3~\cdot~2\Gamma(1) \end{array} \]

Since $\Gamma(1)=\int_{0}^{\infty} \! e^{-x}~\mathrm{d}x=1,$ it follows that, for integral values of n,

\[ \Gamma(n)=(n-1)! \]

Hope you enjoy reading $\ldots$ :)

$\endgroup$
8
$\begingroup$

Lowbrow: $\int\sin(x)\cos(x)dx=\sin^2x-\int\sin(x)\cos(x)dx+C$.

Finding the unknown integral again after integrating by parts is an interesting case. Solving the resulting equation immediately gives the result $\int\sin(x)\cos(x)dx=\dfrac12\sin^2x$

$\endgroup$
4
  • 4
    $\begingroup$ On the other hand, spotting $\frac12\sin(2x) = \sin(x)\cos(x)$ eliminates any difficulty with integration. $\endgroup$ Commented Feb 4, 2011 at 19:49
  • 5
    $\begingroup$ ...or use the substitution $t=\sin x$. $\endgroup$ Commented Feb 4, 2011 at 20:18
  • 10
    $\begingroup$ Like the waitress said, "plus a constant!" (e.g., see: preposterousuniverse.blogspot.com/2004/07/…) $\endgroup$ Commented Feb 4, 2011 at 22:07
  • $\begingroup$ + C (that's it but I have to type more) $\endgroup$
    – GeoffDS
    Commented Apr 14, 2011 at 19:46
7
$\begingroup$

Lowbrow: $\int e^x\sin x\ dx$ and its ilk.

$\endgroup$
1
  • 17
    $\begingroup$ This can be done more efficiently using integration of complex functions: integrate $e^{x(1+i)}$, which is trivial, then take the imaginary part. $\endgroup$
    – Alex B.
    Commented Oct 11, 2011 at 0:48
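The complex-exponential shortcut from the comment is easy to verify: an antiderivative of $e^x\sin x$ is $\operatorname{Im}\left[e^{(1+i)x}/(1+i)\right]$, which matches the usual two-rounds-of-IBP answer $e^x(\sin x-\cos x)/2$. A Python sketch (function names are mine):

```python
import cmath, math

def antiderivative(x):
    """Im[ e^{(1+i)x} / (1+i) ], an antiderivative of eˣ·sin x obtained by
    integrating e^{(1+i)x} and taking the imaginary part."""
    return (cmath.exp((1 + 1j) * x) / (1 + 1j)).imag

# cross-check against the classic integration-by-parts answer eˣ(sin x − cos x)/2
for x in (-1.0, 0.0, 0.7, 2.0):
    byparts = math.exp(x) * (math.sin(x) - math.cos(x)) / 2
    assert abs(antiderivative(x) - byparts) < 1e-12
```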
7
$\begingroup$

There are a couple of applications in PDEs that I am quite fond of. As well as verifying that the Laplace operator $-\Delta$ is positive on $L^2$, I like the application of integration by parts in the energy method to prove uniqueness.

Suppose $U$ is an open, bounded and connected subset of $\mathbb{R}^n$ with smooth boundary. Consider the BVP \begin{equation*} -\Delta u=f~\text{in}~U,\qquad u=g~\text{on}~\partial U. \end{equation*} Suppose $u,v\in C^2(\overline{U})$ both solve it and set $w:=u-v$, so that $w$ satisfies the homogeneous problem $-\Delta w=0$ in $U$, $w=0$ on $\partial U$. Then an application of integration by parts gives us \begin{equation*} 0=-\int_U w\Delta w\,dx=\int_U \nabla w\cdot \nabla w\,dx-\int_{\partial U}w\frac{\partial w}{\partial\nu}\,dS=\int_U|\nabla w|^2\,dx \end{equation*} with outward normal $\nu$ of the set $U$, the boundary integral vanishing because $w=0$ on $\partial U$. Hence $\nabla w=0$, so $w$ is constant on the connected set $U$; since $w=0$ on the boundary, $w\equiv 0$, and the solution is unique.

$\endgroup$
0
6
$\begingroup$

Really simple but nice:

$\int \log (x) dx = \int 1 \cdot \log(x)dx = x \log(x) - \int x d(\log(x))=x (\log(x)-1) $

also:

$ \int \frac{\log^k(x)}{x}dx = \int \log^k(x)d \log(x)=\frac{\log^{k+1}(x)}{k+1} $

$\endgroup$
1
  • $\begingroup$ The "also" is very nice, not forgetting the "+c" (+1 vote) $\endgroup$ Commented Dec 19, 2020 at 9:42
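Both antiderivatives check out numerically (a Python sketch; the interval $[1,5]$ and the exponent $k=3$ are arbitrary choices of mine):

```python
import math

def simpson(f, a, b, n=2000):
    """Composite Simpson's rule on [a, b]; n must be even."""
    h = (b - a) / n
    s = f(a) + f(b) + sum((4 if k % 2 else 2) * f(a + k * h) for k in range(1, n))
    return s * h / 3

F = lambda x: x * (math.log(x) - 1)                  # antiderivative of log x
G = lambda x, k: math.log(x) ** (k + 1) / (k + 1)    # antiderivative of logᵏ(x)/x

assert abs(simpson(math.log, 1, 5) - (F(5) - F(1))) < 1e-9
k = 3
assert abs(simpson(lambda x: math.log(x) ** k / x, 1, 5) - (G(5, k) - G(1, k))) < 1e-9
```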
1
$\begingroup$

Integration by parts shows that (modulo a constant) the Fourier transform interchanges differentiation and multiplication by the variable:

$\begin{align*} f'(x) \rightarrow \widehat{f'}(\xi) & = \int_{\mathbb{R}} f'(x)e^{-2 \pi i x \xi}dx\\ & = f(x)e^{-2 \pi i x \xi}|_{-\infty}^{\infty} - \int_{\mathbb{R}} f(x) e^{-2 \pi i x \xi} (-2 \pi i \xi) dx \\ & = (2\pi i \xi) \widehat{f}(\xi) \end{align*}$

where $f(x)e^{-2 \pi i x \xi}|_{-\infty}^{\infty}$ vanishes if $f$ decays fast.

$\endgroup$
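For a concrete check, take the self-dual Gaussian $f(x)=e^{-\pi x^2}$, for which $\widehat f(\xi)=e^{-\pi\xi^2}$; the transform of $f'$ should then be $2\pi i\xi\,e^{-\pi\xi^2}$. A Python sketch (truncating the integral to $[-6,6]$, where the Gaussian has fully decayed, is my own choice):

```python
import cmath, math

def simpson(f, a, b, n=4000):
    """Composite Simpson's rule on [a, b]; works for complex-valued f."""
    h = (b - a) / n
    s = f(a) + f(b) + sum((4 if k % 2 else 2) * f(a + k * h) for k in range(1, n))
    return s * h / 3

f  = lambda x: math.exp(-math.pi * x * x)     # Gaussian with f̂(ξ) = e^{−πξ²}
df = lambda x: -2 * math.pi * x * f(x)        # its derivative

for xi in (0.5, 1.0, 2.0):
    # ∫ f'(x) e^{−2πi x ξ} dx over a range where the Gaussian is negligible
    ft_df = simpson(lambda x: df(x) * cmath.exp(-2j * math.pi * x * xi), -6, 6)
    expected = 2j * math.pi * xi * math.exp(-math.pi * xi * xi)
    assert abs(ft_df - expected) < 1e-6
```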
