77
$\begingroup$

After reading this question, I noticed that the most popular answer uses the identity $$\sum_{t=0}^n \binom{t}{k} = \binom{n+1}{k+1},$$ or, equivalently, $$\sum_{t=k}^n \binom{t}{k} = \binom{n+1}{k+1}.$$

What's the name of this identity? Is it a modified form of the Pascal's triangle identity?

How can we prove it? I tried by induction, but without success. Can we also prove it algebraically?

Thanks for your help.


EDIT 01: This identity is known as the hockey-stick identity because, on Pascal's triangle, when the addends represented in the summation and the sum itself are highlighted, a hockey-stick shape is revealed.

[Image: the hockey-stick pattern highlighted on Pascal's triangle]

$\endgroup$
6
  • 9
    $\begingroup$ It is sometimes called the "hockey stick". $\endgroup$
    – user940
    Commented Oct 21, 2015 at 15:24
  • $\begingroup$ There is another cute graphical illustration on the plane of $\binom{n}{k}$ $\endgroup$ Commented Oct 21, 2015 at 16:54
  • 6
    $\begingroup$ It's pretty straightforward from the picture. Just switch the $1$ at the top of the stick with the $1$ directly below, then repeatedly replace adjacent numbers with the number in the cell below. This can be translated into a formal proof with words and symbols, but an animation or series of pictures is much more effective. $\endgroup$ Commented Oct 22, 2015 at 3:24
  • $\begingroup$ See also this question. Some posts which are linked there might be of interest, too. $\endgroup$ Commented Jan 18, 2016 at 15:05
  • $\begingroup$ May I ask where this image is from? I would really like to use it in the Wikipedia article on the Hockey stick identity but of course I want to credit the source $\endgroup$ Commented Dec 19, 2019 at 19:49

20 Answers

48
$\begingroup$

Imagine the first $n + 1$ numbers, written in order on a piece of paper. The right-hand side counts the number of ways you can pick $k+1$ of them. In how many ways can you do this?

First pick the largest of the $k+1$ numbers, which you circle. Call it $s$. Next, you still have to pick $k$ numbers, each less than $s$, and there are $\binom{s - 1}{k}$ ways to do this.

Since $s$ ranges from $1$ to $n+1$, $t := s-1$ ranges from $0$ to $n$, as desired.
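
For readers who like to see the double counting in action, here is a small brute-force check of this argument (a sketch of my own, not part of the original answer, assuming Python 3.8+ for `math.comb`; the values of `n` and `k` are arbitrary):

    from itertools import combinations
    from math import comb

    n, k = 7, 2  # small example parameters, chosen arbitrarily

    numbers = range(1, n + 2)                 # the first n + 1 numbers
    subsets = list(combinations(numbers, k + 1))

    # Group the (k+1)-subsets by their largest element s: each group should
    # contain C(s-1, k) subsets, and all groups together give C(n+1, k+1).
    by_largest = {s: sum(1 for c in subsets if max(c) == s) for s in numbers}

    assert all(by_largest[s] == comb(s - 1, k) for s in numbers)
    assert sum(comb(t, k) for t in range(n + 1)) == comb(n + 1, k + 1)
    print(len(subsets), comb(n + 1, k + 1))   # both 56 for n = 7, k = 2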

$\endgroup$
1
  • 2
    $\begingroup$ Why did you call the highest number $s$? Why not call it $s + 1$? Isn't it easier if you call it $s+1$, because there are $\dbinom{s}{k}$ ways to pick $k$ numbers? You don't need to change the bounds or range of summation, as you do in your last para. $\endgroup$
    – user53259
    Commented Jul 11, 2021 at 20:19
33
$\begingroup$

We can use the well-known identity $$1+x+\dots+x^n = \frac{x^{n+1}-1}{x-1}.$$ After the substitution $x=1+t$, this becomes $$1+(1+t)+\dots+(1+t)^n=\frac{(1+t)^{n+1}-1}t.$$ Both sides of this equation are polynomials in $t$. (Notice that the RHS simplifies to $\sum_{j=1}^{n+1}\binom {n+1}j t^{j-1}$.)

If we compare coefficient of $t^{k}$ on the LHS and the RHS we see that $$\binom 0k + \binom 1k + \dots + \binom nk = \binom{n+1}{k+1}.$$


This proof is basically the same as the proof using generating functions, which was posted in other answers. However, I think it is phrased a bit differently. (And if it is formulated this way, even somebody who has never heard of generating functions can follow the proof.)
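
A quick way to see the coefficient comparison concretely is to compute both sides for a small $n$; the following sketch (mine, assuming Python 3.8+ for `math.comb`; the value of `n` is arbitrary) lists the coefficient of $t^k$ in $1+(1+t)+\dots+(1+t)^n$ next to $\binom{n+1}{k+1}$:

    from math import comb

    n = 9  # example degree, chosen arbitrarily

    # Coefficient of t^k in 1 + (1+t) + ... + (1+t)^n, for k = 0, ..., n.
    lhs = [sum(comb(i, k) for i in range(n + 1)) for k in range(n + 1)]

    # Coefficient of t^k in ((1+t)^(n+1) - 1)/t, i.e. C(n+1, k+1).
    rhs = [comb(n + 1, k + 1) for k in range(n + 1)]

    assert lhs == rhs
    print(lhs)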

$\endgroup$
28
$\begingroup$

This is purely algebraic. First of all, since $\dbinom{t}{k} = 0$ when $k>t$, we can rewrite the identity in question as $$\binom{n+1}{k+1} = \sum_{t=0}^{n} \binom{t}{k}=\sum_{t=k}^{n} \binom{t}{k}$$

Recall that (by Pascal's rule) $$\binom{n}{k} = \binom{n-1}{k-1} + \binom{n-1}{k}$$

Hence $$\binom{t+1}{k+1} = \binom{t}{k} + \binom{t}{k+1} \implies \binom{t}{k} = \binom{t+1}{k+1} - \binom{t}{k+1}$$

Now sum this over $t$ from $k$ to $n$: $$\sum_{t=k}^{n} \binom{t}{k} = \sum_{t=k}^{n} \binom{t+1}{k+1} - \sum_{t=k}^{n} \binom{t}{k+1}$$

Split off the last term of the first sum and the first term of the second sum: $$\sum _{t=k}^{n} \binom{t}{k} =\left( \sum_{t=k}^{n-1} \binom{t+1}{k+1} + \binom{n+1}{k+1} \right) -\left( \sum_{t=k+1}^{n} \binom{t}{k+1} + \binom{k}{k+1} \right)$$

Obviously $\dbinom{k}{k+1} = 0$, hence we get $$\sum _{t=k}^{n} \binom{t}{k} =\binom{n+1}{k+1} +\sum_{t=k}^{n-1} \binom{t+1}{k+1} -\sum_{t=k+1}^{n} \binom{t}{k+1}$$

Now substitute $t'=t-1$: as $t$ runs over $k+1,\dots,n$, $t'$ runs over $k,\dots,n-1$, hence $$\sum_{t=k}^{n} \binom{t}{k} = \binom{n+1}{k+1} +\sum_{t=k}^{n-1} \binom{t+1}{k+1} -\sum_{t'=k}^{n-1} \binom{t'+1}{k+1}$$

The last two sums cancel, and we get the desired identity $$\binom{n+1}{k+1} = \sum_{t=k}^{n} \binom{t}{k} = \sum_{t=0}^{n} \binom{t}{k}$$
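
The telescoping step is easy to check numerically; here is a short sketch of my own (assuming Python 3.8+, where `math.comb` conveniently returns $0$ for $\binom{k}{k+1}$; the parameter values are arbitrary):

    from math import comb

    n, k = 10, 3  # example parameters, chosen arbitrarily

    # Pascal's rule rearranged: C(t, k) = C(t+1, k+1) - C(t, k+1) for every t.
    assert all(comb(t, k) == comb(t + 1, k + 1) - comb(t, k + 1)
               for t in range(k, n + 1))

    # Summing these differences telescopes to C(n+1, k+1) - C(k, k+1) = C(n+1, k+1).
    total = sum(comb(t, k) for t in range(k, n + 1))
    assert total == comb(n + 1, k + 1) - comb(k, k + 1) == comb(n + 1, k + 1)
    print(total, comb(n + 1, k + 1))  # both 330 for n = 10, k = 3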

$\endgroup$
0
24
$\begingroup$

$$\begin{align} \sum_{t=\color{blue}0}^n \binom{t}{k} =\sum_{t=\color{blue}k}^n\binom tk&= \sum_{t=k}^n\left[ \binom {t+1}{k+1}-\binom {t}{k+1}\right]\\ &=\sum_{t=\color{orange}k}^\color{orange}n\binom {\color{orange}{t+1}}{k+1}-\sum_{t=k}^n\binom t{k+1}\\ &=\sum_{t=\color{orange}{k+1}}^{\color{orange}{n+1}}\binom {\color{orange}{t}}{k+1}-\sum_{t=k}^n\binom t{k+1}\\ &=\binom{n+1}{k+1}-\underbrace{\binom k{k+1}}_0&&\text{by telescoping}\\ &=\binom{n+1}{k+1}\quad\blacksquare\\ \end{align}$$

$\endgroup$
19
$\begingroup$

You can use induction on $n$, observing that

$$ \sum_{t=0}^{n+1} \binom{t}{k} = \sum_{t=0}^{n} \binom{t}{k} + \binom{n+1}{k} = \binom{n+1}{k+1} + \binom{n+1}{k} = \binom{n+2}{k+1} $$
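
If it helps, the induction can be exercised mechanically; the sketch below (mine, not part of the original answer, assuming Python 3.8+ for `math.comb`; the fixed `k` and the range of `n` are arbitrary) checks both the claim and the Pascal's-rule step used above:

    from math import comb

    def partial_sum(n: int, k: int) -> int:
        """sum_{t=0}^{n} C(t, k), accumulated term by term as in the induction."""
        total = 0
        for t in range(n + 1):
            total += comb(t, k)
        return total

    k = 4  # fixed k, chosen arbitrarily
    for n in range(k, 20):
        # Inductive claim at stage n, and the Pascal's-rule identity used in the step.
        assert partial_sum(n, k) == comb(n + 1, k + 1)
        assert comb(n + 1, k + 1) + comb(n + 1, k) == comb(n + 2, k + 1)
    print("claim and inductive step verified for k = 4, n < 20")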

$\endgroup$
5
  • $\begingroup$ How can you say that $\sum_{t=0}^n \binom{t}{k} = \binom{n+1}{k+1}$ in your proof? $\endgroup$
    – hlapointe
    Commented Oct 21, 2015 at 15:13
  • 4
    $\begingroup$ That's the inductive hypothesis. $\endgroup$ Commented Oct 21, 2015 at 15:14
  • $\begingroup$ Ok. Can we prove it algebraically? $\endgroup$
    – hlapointe
    Commented Oct 21, 2015 at 15:15
  • $\begingroup$ What's the first step!? Because if I take $n=1$, the hypothesis seems to be incorrect. $\endgroup$
    – hlapointe
    Commented Oct 21, 2015 at 15:21
  • $\begingroup$ @hlapointe One choice of base case for every fixed $k$ is that $\sum_{t=0}^{k} \binom{t}{k} = \binom{k}{k} = 1 = \binom{k+1}{k+1}$. $\endgroup$ Commented Oct 21, 2015 at 16:28
14
$\begingroup$

Another technique is to use snake oil. Call your sum:

$\begin{align} S_k &= \sum_{0 \le t \le n} \binom{t}{k} \end{align}$

Define the generating function:

$\begin{align} S(z) &= \sum_{k \ge 0} S_k z^k \\ &= \sum_{k \ge 0} z^k \sum_{0 \le t \le n} \binom{t}{k} \\ &= \sum_{0 \le t \le n} \sum_{k \ge 0} \binom{t}{k} z^k \\ &= \sum_{0 \le t \le n} (1 + z)^t \\ &= \frac{(1 + z)^{n + 1} - 1}{(1 + z) - 1} \\ &= z^{-1} \left( (1 + z)^{n + 1} - 1 \right) \end{align}$

So we are interested in the coefficient of $z^k$ of this:

$\begin{align} [z^k] z^{-1} \left( (1 + z)^{n + 1} - 1 \right) &= [z^{k + 1}] \left( (1 + z)^{n + 1} - 1 \right) \\ &= \binom{n + 1}{k + 1} \end{align}$
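
As a cross-check of the snake-oil computation (my own addition, not part of the original answer, assuming Python 3.8+ for `math.comb`; the test points are arbitrary), one can compare $S(z)=\sum_k S_k z^k$ with the closed form $z^{-1}\left((1+z)^{n+1}-1\right)$ at a few numerical values of $z$:

    from math import comb

    n = 6  # example, chosen arbitrarily

    # S_k = sum_{0 <= t <= n} C(t, k), the coefficients packaged into S(z).
    S = [sum(comb(t, k) for t in range(n + 1)) for k in range(n + 1)]

    def S_from_coeffs(z: float) -> float:
        """S(z) evaluated directly as sum_k S_k z^k."""
        return sum(S_k * z ** k for k, S_k in enumerate(S))

    def S_closed_form(z: float) -> float:
        """The closed form z^{-1}((1+z)^{n+1} - 1) from the geometric sum."""
        return ((1 + z) ** (n + 1) - 1) / z

    for z in (0.1, 0.5, 2.0, -0.3):
        assert abs(S_from_coeffs(z) - S_closed_form(z)) < 1e-9
    print(S, [comb(n + 1, k + 1) for k in range(n + 1)])  # [z^k] S(z) = C(n+1, k+1)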

$\endgroup$
13
$\begingroup$

The RHS is the number of $(k+1)$-element subsets of $\{1,2,\dots,n+1\}$. Group them according to the largest element in the subset, sum up all the cases, and you get the LHS.

$\endgroup$
12
$\begingroup$

We can use the integral representation of the binomial coefficient $$\dbinom{t}{k}=\frac{1}{2\pi i}\oint_{\left|z\right|=1}\frac{\left(1+z\right)^{t}}{z^{k+1}}dz\tag{1} $$ and get $$\sum_{t=0}^{n}\dbinom{t}{k}=\frac{1}{2\pi i}\oint_{\left|z\right|=1} \frac{\sum_{t = 0}^{n}\left(1+z\right)^{t}}{z^{k+1}}dz. $$ Summing the geometric series, $\sum_{t=0}^{n}\left(1+z\right)^{t}=\frac{\left(1+z\right)^{n+1}-1}{z}$, this becomes $$\frac{1}{2\pi i}\oint_{\left|z\right|=1}\frac{\left(z+1\right)^{n+1}}{z^{k+2}}dz-\frac{1}{2\pi i}\oint_{\left|z\right|=1}\frac{1}{z^{k+2}}dz, $$ and so using $(1)$ again we have $$\sum_{t=0}^{n}\dbinom{t}{k}=\dbinom{n+1}{k+1}-0=\color{red}{\dbinom{n+1}{k+1}.}$$
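
If one wants to sanity-check the contour-integral representation numerically, here is a rough sketch of my own (not part of the answer): it approximates the integral in $(1)$ by a Riemann sum over equally spaced points on the unit circle, assuming Python 3.8+ for `math.comb`; the function name `contour_binom` and the parameter values are arbitrary choices:

    import cmath
    from math import comb, pi

    def contour_binom(t: int, k: int, m: int = 4096) -> float:
        """Approximate (1/2πi) ∮_{|z|=1} (1+z)^t / z^(k+1) dz by an m-point
        Riemann sum; with z = e^{iθ} and dz = iz dθ, the prefactors reduce to 1/m."""
        total = 0j
        for s in range(m):
            z = cmath.exp(2 * pi * 1j * s / m)        # equally spaced points on |z| = 1
            total += (1 + z) ** t / z ** (k + 1) * z  # integrand times z (from dz = iz dθ)
        return (total / m).real

    n, k = 9, 3
    lhs = sum(contour_binom(t, k) for t in range(n + 1))
    print(round(lhs), comb(n + 1, k + 1))  # both should print 210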

$\endgroup$
2
  • 4
    $\begingroup$ It is so nice and weird. +1 $\endgroup$ Commented Jul 5, 2016 at 10:27
  • $\begingroup$ +1. Nice work. You must subtract $\displaystyle{\delta_{k,-1}}$ in order to take account of the case $\displaystyle{k = -1}$. When $\displaystyle{k = -1}$, the LHS is equal to $\displaystyle{0}$ and your RHS is equal to $\displaystyle{1}$. With the $\displaystyle{\delta_{k,-1}}$ you'll get $\displaystyle{1 - 1 = 0}$. $\endgroup$ Commented Jul 6, 2016 at 21:50
7
$\begingroup$

A Generalization

In this answer, I prove the identity $$ \binom{-n}{k}=(-1)^k\binom{n+k-1}{k}\tag{1} $$ Here is a generalization of the identity in question, proven using the Vandermonde Identity $$ \begin{align} \sum_{t=0}^n\binom{t}{k}\binom{n-t}{j} &=\sum_{t=0}^n\binom{t}{t-k}\binom{n-t}{n-t-j}\tag{2}\\ &=\sum_{t=0}^n(-1)^{t-k}\binom{-k-1}{t-k}(-1)^{n-t-j}\binom{-j-1}{n-t-j}\tag{3}\\ &=(-1)^{n-j-k}\sum_{t=0}^n\binom{-k-1}{t-k}\binom{-j-1}{n-t-j}\tag{4}\\ &=(-1)^{n-j-k}\binom{-k-j-2}{n-j-k}\tag{5}\\ &=\binom{n+1}{n-j-k}\tag{6}\\ &=\binom{n+1}{j+k+1}\tag{7} \end{align} $$ Explanation:
$(2)$: $\binom{n}{k}=\binom{n}{n-k}$
$(3)$: apply $(1)$ to each binomial coefficient
$(4)$: combine the powers of $-1$ which can then be pulled out front
$(5)$: apply Vandermonde
$(6)$: apply $(1)$
$(7)$: $\binom{n}{k}=\binom{n}{n-k}$

To get the identity in the question, set $j=0$.
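
As a sanity check on this generalization (a sketch of my own, assuming Python 3.8+ for `math.comb`; the parameter range is arbitrary), one can verify $\sum_{t=0}^n\binom{t}{k}\binom{n-t}{j}=\binom{n+1}{j+k+1}$ over small parameters; $j=0$ recovers the identity in the question:

    from math import comb

    # Check sum_{t=0}^{n} C(t, k) C(n-t, j) = C(n+1, j+k+1) for small parameters.
    for n in range(12):
        for k in range(n + 1):
            for j in range(n + 1):
                lhs = sum(comb(t, k) * comb(n - t, j) for t in range(n + 1))
                assert lhs == comb(n + 1, j + k + 1)
    print("generalization verified for all n < 12")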


A Simpler Proof of the Basic Formula $$ \begin{align} \sum_{k=0}^n\color{#C00}{\binom{k}{m}} &=\sum_{k=0}^n\color{#C00}{\left[x^m\right](1+x)^k}\tag8\\ &=\left[x^m\right]\frac{(1+x)^{n+1}-1}{(1+x)-1}\tag9\\[6pt] &=\left[x^{m+1}\right]\left((1+x)^{n+1}-1\right)\tag{10}\\[6pt] &=\binom{n+1}{m+1}\tag{11} \end{align} $$ Explanation:
$\phantom{1}\text{(8)}$: use the definition of the binomial coefficient
$\phantom{1}\text{(9)}$: sum the finite geometric series
$(10)$: multiply by $x$ and adjust the exponent of $x$
$(11)$: extract the coefficient using the binomial theorem

$\endgroup$
13
  • 2
    $\begingroup$ @FoF: I have added a link here and answered your other question. Thanks for mentioning the difficulty. $\endgroup$
    – robjohn
    Commented Dec 7, 2013 at 12:33
  • 1
    $\begingroup$ @FoF: That is the Vandermonde Identity that I mentioned at the beginning. $\endgroup$
    – robjohn
    Commented Dec 8, 2013 at 18:56
  • 1
    $\begingroup$ @FoF: I added an explanation for each line. $\endgroup$
    – robjohn
    Commented Dec 9, 2013 at 2:20
  • 1
    $\begingroup$ I answered my own question about $(5, 6)$ here. $\endgroup$
    – NaN
    Commented Dec 10, 2013 at 8:54
  • 1
    $\begingroup$ @FoF: Ah. That is why I added the Explanation when I saw difficulty in following the argument. $\endgroup$
    – robjohn
    Commented Dec 11, 2013 at 7:46
5
$\begingroup$

You remember that $$ (1+x)^m = \sum_k \binom{m}{k} x^k. $$ So the sum $$ \sum_{m=0}^M \binom{m+k}{k} $$ is the coefficient of $ x^k $ in $$ \sum_{m=0}^M (1+x)^{m+k}. $$ Yes? So now use the geometric series formula: $$ \sum_{m=0}^M (1+x)^{m+k} = -\frac{(1+x)^k}{x} \left( 1 - (1+x)^{M+1} \right). $$ And now you want to know what the coefficient of $x^k$ is in there. You can take it from here.

$\endgroup$
3
$\begingroup$

Recall that for $k\in\Bbb N$ we have the generating function

$$\sum_{n\ge 0}\binom{n+k}kx^n=\frac1{(1-x)^{k+1}}\;.$$

The identity in the question can therefore be rewritten as

$$\left(\sum_{n\ge 0}\binom{n+k}kx^n\right)\left(\sum_{n\ge 0}x^n\right)=\sum_{n\ge 0}\binom{n+k+1}{k+1}x^n\;.$$

The coefficient of $x^n$ in the product on the left is

$$\sum_{i=0}^n\binom{i+k}k\cdot1=\sum_{i=0}^n\binom{i+k}k\;,$$

which is the $n$-th term of the discrete convolution of the sequences $\left\langle\binom{n+k}k:n\in\Bbb N\right\rangle$ and $\langle 1,1,1,\dots\rangle$. And at this point you’re practically done.
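
The convolution step can be made concrete by multiplying truncated power series; here is a rough sketch of my own (assuming Python 3.8+ for `math.comb`; the fixed `k` and the truncation order `N` are arbitrary):

    from math import comb

    k, N = 3, 10  # fixed k and truncation order, chosen arbitrarily

    a = [comb(n + k, k) for n in range(N)]   # coefficients of 1/(1-x)^(k+1)
    b = [1] * N                              # coefficients of 1/(1-x)

    # The n-th coefficient of the product is the discrete convolution of a and b.
    product = [sum(a[i] * b[n - i] for i in range(n + 1)) for n in range(N)]

    # It matches the coefficients of 1/(1-x)^(k+2), i.e. C(n+k+1, k+1).
    assert product == [comb(n + k + 1, k + 1) for n in range(N)]
    print(product)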

$\endgroup$
14
  • $\begingroup$ Is there a typo in the second equation (first sum)? I believe $k$ should be indexed. $\endgroup$ Commented May 27, 2013 at 6:20
  • $\begingroup$ @Alan: No, the sum is over $n$; $k$ is fixed throughout. $\endgroup$ Commented May 27, 2013 at 7:19
  • $\begingroup$ In my text, I have an identity $\sum_{r\geq 0} \binom{r + n}{r} x^r = 1/(1-x)^{n+1}$ This may be the cause of my confusion, but is this identity correct and is it equivalent to the one you used? $\endgroup$ Commented May 27, 2013 at 8:22
  • $\begingroup$ @Alan: Sure: your $r$ is my $n$, and your $n$ is my $k$. $\endgroup$ Commented May 27, 2013 at 8:28
  • 1
    $\begingroup$ @Alan: $\binom{r+n}r=\binom{r+n}n$; now do the translation. (Sorry: I didn’t notice before that you’d used the symmetrically opposite binomial coefficient.) $\endgroup$ Commented May 27, 2013 at 19:19
2
$\begingroup$

We can prove this by counting in two ways.

Let $S$ be the set of all $(k+1)$-element subsets of $[n+1]$. By definition, $|S|=\binom{n+1}{k+1}$.

Let $S_i$ be the set of all $(k+1)$-element subsets of $[n+1]$ such that the largest element is $i+1$. Picking $k+1$ elements from $[n+1]$ such that the largest element is $i+1$ is a two-step-process.

(Step 1) Pick $i+1$. The number of way(s) to do this is $\binom{1}{1}$.

(Step 2) Pick the $k$ elements from the remaining $i$ elements. The number of way(s) to do this is $\binom{i}{k}$.

Therefore, $|S_i|=\binom{1}{1}\binom{i}{k}=\binom{i}{k}$. Since we can see that $S_k, S_{k+1}, S_{k+2}, \dots, S_n$ partition $S$, we have that \begin{gather*} \sum_{i=k}^n|S_i|=|S|\\ \sum_{i=k}^n\binom{i}{k}=\binom{n+1}{k+1} \end{gather*} Since we know that if $i < k$, then $\binom{i}{k}=0$, we can say that $\sum_{i=k}^n\binom{i}{k}=\sum_{i=0}^n\binom{i}{k}$. Therefore, we have \begin{gather*} \sum_{i=0}^n \binom{i}{k} = \binom{n+1}{k+1} \end{gather*}

$\endgroup$
0
2
$\begingroup$

A standard technique to prove such identities $\sum_{i=0}^Mf(i)=F(M)$, involving on one hand a sum where only the upper bound $M$ is variable and on the other hand an explicit expression in terms of$~M$, is to use induction on$~M$. It amounts to showing that $f(M)=F(M)-F(M-1)$ (and that $F(0)=f(0)$). This is similar to using the fundamental theorem of calculus in showing that $\int_0^{x_0}f(x)\mathrm dx=F(x_0)$ by establishing $f(x)=F'(x)$ (and $F(0)=0$).

So here you need to check (apart from the obvious starting case $M=0$) that $\binom{M+k}k=\binom{M+k+1}{k+1}-\binom{M+k}{k+1}$. This is just an instance of Pascal's recurrence for binomial coefficients.

Added remark: this identity (not its name) is very old. It is one of the first "conséquences" that Pascal gives in his "Traité du Triangle arithmétique" after defining the triangle by means of (what is now called) Pascal's recursion. Indeed, it is either the "conséquence seconde" or the "conséquence troisième", depending on how one identifies the Triangle arithmétique, which is a rectangular table, with modern depictions of the triangle. If one makes the "columns" (rangs perpendiculaires) correspond to the sets of $\binom nk$ with $k$ fixed, while "rows" (rangs parallèles) correspond to the sets of $\binom nk$ with $n-k$ fixed (this is geometrically most natural, basically a rotation by $-\frac\pi4$), then it is the "conséquence troisième"; but if one respects the combinatorial interpretation Pascal gives (much later) in Proposition II, then the identification differs by a symmetry of the triangle, and one gets the "conséquence seconde", which talks about sums along rows rather than columns. (For comparison, the "conséquence première" states that every entry on the border of the triangle equals $1$.)

CONSÉQUENCE TROISIÈME. En tout Triangle arithmétique, chaque cellule égale la somme de toutes celles du rang perpendiculaire précédent, comprises depuis son rang parallèle jusqu'au premier inclusivement.

Loosely translated: in every Pascal's triangle, each entry equals the sum of those of the previous column, from that of its (own) row up to the first (row) inclusive.

$\endgroup$
2
$\begingroup$

The terms $\binom tk$ count the ways to distribute $t-k$ balls over $k+1$ bins, and we want to show that they sum to $\binom{n+1}{k+1}$, the number of ways to distribute $n-k$ balls over $k+2$ bins. Designate one of the $k+2$ bins as special and enumerate the ways to distribute $n-k$ balls over the $k+2$ bins according to the number $n-t$ of balls placed in the designated bin, with the remaining $t-k$ balls distributed over the remaining $k+1$ bins.
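
A small enumeration makes this bookkeeping visible; the sketch below (my own, not part of the original answer, assuming Python 3.8+ for `math.comb`; the helper `distributions` and the parameter values are arbitrary choices) lists all ways to place $n-k$ balls in $k+2$ bins and groups them by the contents of the designated bin:

    from math import comb

    def distributions(balls: int, bins: int):
        """Yield every way to place `balls` identical balls into `bins` labelled bins."""
        if bins == 1:
            yield (balls,)
            return
        for first in range(balls + 1):
            for rest in distributions(balls - first, bins - 1):
                yield (first,) + rest

    n, k = 8, 2  # example parameters, chosen arbitrarily

    all_dists = list(distributions(n - k, k + 2))
    assert len(all_dists) == comb(n + 1, k + 1)

    # If the designated (last) bin holds n - t balls, the remaining t - k balls
    # go into the other k + 1 bins in C(t, k) ways.
    for t in range(k, n + 1):
        assert sum(1 for d in all_dists if d[-1] == n - t) == comb(t, k)
    print(len(all_dists), comb(n + 1, k + 1))  # both 84 for n = 8, k = 2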

$\endgroup$
2
$\begingroup$

$\newcommand{\angles}[1]{\left\langle\,{#1}\,\right\rangle} \newcommand{\braces}[1]{\left\lbrace\,{#1}\,\right\rbrace} \newcommand{\bracks}[1]{\left\lbrack\,{#1}\,\right\rbrack} \newcommand{\dd}{\mathrm{d}} \newcommand{\ds}[1]{\displaystyle{#1}} \newcommand{\expo}[1]{\,\mathrm{e}^{#1}\,} \newcommand{\half}{{1 \over 2}} \newcommand{\ic}{\mathrm{i}} \newcommand{\iff}{\Leftrightarrow} \newcommand{\imp}{\Longrightarrow} \newcommand{\ol}[1]{\overline{#1}} \newcommand{\pars}[1]{\left(\,{#1}\,\right)} \newcommand{\partiald}[3][]{\frac{\partial^{#1} #2}{\partial #3^{#1}}} \newcommand{\root}[2][]{\,\sqrt[#1]{\,{#2}\,}\,} \newcommand{\totald}[3][]{\frac{\mathrm{d}^{#1} #2}{\mathrm{d} #3^{#1}}} \newcommand{\verts}[1]{\left\vert\,{#1}\,\right\vert}$ Assuming $\ds{M \geq 0}$:

\begin{align} & \mbox{Note that} \\[2mm] &\ \sum_{m = 0}^{M}{m + k \choose k} = \sum_{m = k}^{M + k}{m \choose k} = a_{M + k} - a_{k - 1}\quad\mbox{where}\quad a_{n} \equiv \sum_{m = 0}^{n}{m \choose k}\tag{1} \end{align}


Then, \begin{align} \color{#f00}{a_{n}} & \equiv \sum_{m = 0}^{n}{m \choose k} = \sum_{m = 0}^{n}\ \overbrace{% \oint_{\verts{z} = 1}{\pars{1 + z}^{m} \over z^{k + 1}}\,{\dd z \over 2\pi\ic}} ^{\ds{m \choose k}}\ =\ \oint_{\verts{z} = 1}{1 \over z^{k + 1}}\sum_{m = 0}^{n}\pars{1 + z}^{m} \,{\dd z \over 2\pi\ic} \\[3mm] & = \oint_{\verts{z} = 1}{1 \over z^{k + 1}}\, {\pars{1 + z}^{n + 1} - 1 \over \pars{1 + z} - 1}\,{\dd z \over 2\pi\ic} \\[3mm] & = \underbrace{\oint_{\verts{z} = 1}{\pars{1 + z}^{n + 1} \over z^{k + 2}} \,{\dd z \over 2\pi\ic}}_{\ds{n + 1 \choose k + 1}}\ -\ \underbrace{\oint_{\verts{z} = 1}{1 \over z^{k + 2}}\,{\dd z \over 2\pi\ic}} _{\ds{\delta_{k + 2,1}}} \\[8mm] \imp\ \color{#f00}{a_{n}} & = \fbox{$\ds{\quad% {n + 1 \choose k + 1} - \delta_{k,-1}\quad}$} \end{align}
\begin{align} \mbox{With}\ \pars{1}\,,\quad \color{#f00}{\sum_{m = 0}^{M}{m + k \choose k}} & = \bracks{{M + k + 1 \choose k + 1} - \delta_{k,-1}} - \bracks{{k \choose k + 1} - \delta_{k,-1}} \\[3mm] & = {M + k + 1 \choose k + 1} - {k \choose k + 1} \end{align} Thanks to $\ds{@robjohn}$ user who pointed out the following feature: $$ {k \choose k + 1} = {-k + k + 1 - 1 \choose k + 1}\pars{-1}^{k + 1} = -\pars{-1}^{k}{0 \choose k + 1} = \delta_{k,-1} $$ such that $$ \begin{array}{|c|}\hline\mbox{}\\ \ds{\quad\color{#f00}{\sum_{m = 0}^{M}{m + k \choose k}} = \color{#f00}{{M + k + 1 \choose k + 1} - \delta_{k,-1}}\quad} \\ \mbox{}\\ \hline \end{array}\\ $$
$\endgroup$
3
  • $\begingroup$ Since $k=-1$ is covered in the first part, it should be noted that since $\binom{-1}{0}=1$, $$\binom{k}{k+1}-\delta_{k,-1}=0$$ therefore the final answer seems it should be $$\binom{M+k+1}{k+1}-\delta_{k,-1}$$ $\endgroup$
    – robjohn
    Commented Jul 25, 2016 at 13:00
  • $\begingroup$ @robjohn Thanks. I'm checking everything right now. $\endgroup$ Commented Jul 25, 2016 at 21:48
  • $\begingroup$ @robjohn Thanks. Fixed. $\endgroup$ Commented Jul 25, 2016 at 22:09
1
$\begingroup$

This is essentially the same as the induction answer already mentioned, but it brings a pictorial perspective so I thought to add it as an alternative answer here.

Here's a restatement of the identity (which you can verify to be equivalent easily): On Pascal's triangle, start from a number (one of the $1$s) on the left edge and move diagonally rightwards and downwards, summing the terms as we go along. We can decide to stop at any point, and the sum of all these terms will be the number directly below and to the left of the final summand.

This actually trivialises the proof of the identity. Note that if we decided to add one more term to the sum (the term to the right of the current sum), on the one hand this "lengthens" the stick by $1$ tile, but on the other hand it adds the term adjacent to the sum---which by definition of Pascal's triangle, produces the number in the tile directly below and horizontally in between the sum and this new term. This can be rigorously translated to the inductive step in a formal induction proof.

[Image: the hockey-stick pattern highlighted on Pascal's triangle]

To illustrate, let's refer to the picture in the question, and focus on the yellow hexagonal tiles. (Note that this is a reflected case of what I described above since it starts from the right edge, but this doesn't affect the discussion.) Currently, we have $1+6+21+56=84$, which is a true identity. If I added $126$ to the LHS, I would also be adding $126$ to the RHS, which by definition gives us $210$, the term below and in between them on Pascal's triangle. Once you really convince yourself of the validity of this argument, the (formal) proof of the identity should come naturally!

$\endgroup$
1
  • $\begingroup$ Your answer emphasises the recursive nature of the solution. +1. But, you never seemed to hint towards the term of recursion in your answer! Answer at: math.stackexchange.com/a/4009354/424260 , states recursion, but using arithmetic. $\endgroup$
    – jiten
    Commented Feb 3, 2021 at 1:30
0
$\begingroup$

Let me attempt a “story” solution:

Let’s say you are trying to find ways to select $k+1$ integers from the first $n+1$ integers.

We consider all the cases, from where the largest number is $k+1$, up to where the largest number is $n+1$.

It is evident that if the largest number is $k+1$, there is only one possibility, namely selecting every number from $1$ to $k+1$, and we can write this as $\binom kk$.

More generally, if the largest number is $j\geq k+1$, there are $\binom{j-1}{k}$ selections.

We then sum all of these up, all the way to the case where the largest number is $n+1$.

$\endgroup$
1
0
$\begingroup$

First proof

Using stars and bars, the number of ways to put $n$ identical objects into $k$ bins (empty bins allowed) is $\binom{n+k-1}{k-1}$.

If we reduce the number of bins by one, the number of ways to put $n$ identical objects into $k-1$ bins is $\binom{n+k-2}{k-2}$.

We can also count the number of ways to put $n$ identical objects into $k$ bins using the answer for $k-1$ bins. Split them into different cases based on how many objects you put in the first bin. \begin{array} {|c|c|c|}\hline \text{Number of objects for first bin} & \text{Number of objects remaining for other bins} & \text{Number of ways to distribute} \\ \hline 0 & n & \binom{n+k-2}{k-2} \\ \hline 1 & n-1 & \binom{n+k-3}{k-2} \\ \hline 2 & n-2 & \binom{n+k-4}{k-2} \\ \hline \vdots & \vdots & \vdots \\ \hline n-2 & 2 & \binom{k}{k-2} \\ \hline n-1 & 1 & \binom{k-1}{k-2} \\ \hline n & 0 & \binom{k-2}{k-2} \\ \hline \end{array}

Therefore, $$\binom{n+k-1}{k-1} = \binom{k-2}{k-2} + \binom{k-1}{k-2} + \binom{k}{k-2} +\dots+ \binom{n+k-4}{k-2} + \binom{n+k-3}{k-2} + \binom{n+k-2}{k-2}$$

Rename the variables: Let $m+1 = n+k-1$ and $r+1 = k-1$. We get: $$\binom{m+1}{r+1} = \binom{r}{r} + \binom{r+1}{r} + \binom{r+2}{r} +\dots+ \binom{m-2}{r} + \binom{m-1}{r} + \binom{m}{r}$$

This is the identity when expanded.
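
The table can be reproduced row by row with a couple of lines of code; a sketch of my own (assuming Python 3.8+ for `math.comb`; the values of `n` and `k` are arbitrary):

    from math import comb

    n, k = 7, 4  # n identical objects, k bins; values chosen arbitrarily

    # Row i of the table: i objects in the first bin leaves n - i objects for
    # the remaining k - 1 bins, which can be arranged in C(n - i + k - 2, k - 2) ways.
    rows = [comb(n - i + k - 2, k - 2) for i in range(n + 1)]

    assert sum(rows) == comb(n + k - 1, k - 1)
    print(rows, sum(rows), comb(n + k - 1, k - 1))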

Second proof

Suppose we have $m+1$ people, all of different ages, and we want to select $r+1$ of them to be on some committee. There are $$\binom{m+1}{r+1}$$ ways to form the committee.

Alternatively, we can count this by splitting the committee based on who the oldest person in the committee is.

The oldest person can be the oldest in the committee. Then there are $m$ people left and we still need to add $r$ people to the committee, so there are $\binom{m}{r}$ ways.

The second oldest person can be the oldest in the committee. Then there are $m-1$ people left to choose from, since we must exclude the oldest person and the second oldest (already selected), and we still need to add $r$ people to the committee, so there are $\binom{m-1}{r}$ ways.

In general, suppose the $k$th oldest person is the oldest in the committee. Then there are $m-k+1$ possible people left to add, since we must exclude everyone older than the $k$th oldest person as well as the $k$th oldest (already selected). We still need to add $r$ people to the committee, so there are ${m-k+1 \choose r}$ ways.

In each case there need to be at least $r$ people left to choose from, so the last case is when we select the $(m+1-r)$th oldest person, which gives ${r \choose r}$ ways.

So $$\binom{m+1}{r+1} = \binom{r}{r} + \binom{r+1}{r} + \binom{r+2}{r} +\dots+ \binom{m-2}{r} + \binom{m-1}{r} + \binom{m}{r}$$

$\endgroup$
0
$\begingroup$

Here is what I hope is an intuitive proof with few numbers and no diagrams:

Consider N choose K.

If one imagines all elements of the set in fixed order in a linear array, with the first combination being the first K elements, then the question is: how do we continue to generate further combinations?

One way is to slide the rightmost element pointer one element to the right. This is a new combo; we then take the combos of size K-1 drawn from the elements to the left of the pointer and compose each of them with the element the pointer now selects, and each composed combo is of course of the desired size K.

We continue incrementing the rightmost pointer by one; each time we do, the (K-1)-combos have a larger (by one) subset of the N elements to choose from, and we can slide the rightmost pointer N-K times in total.

Just to give a concrete example, let's try 5 choose 3 from the set ABCDE. The first combo is ABC; then we slide the pointer, so the next combo is ABD. We get the further combos by taking the 2-combos of ABC, of which there are 3, and composing each with D.

So the combo count before any slide is 3 choose 3, or 1. The next combos are D composed with the 2-combos of ABC: 3 choose 2, which is 3.

When the rightmost pointer is pointing at E, we are choosing 2-combos from ABCD: 4 choose 2, or 6.

Summing we get 1 + 3 + 6 = 10 which is (5 choose 3).

I hope the above is both correct and easier to understand than the other proofs. I realize I may simply be restating other proofs, but I don't think the idea of composing combos with another element to obtain further combos is emphasized in the other proofs.

To be a real proof, I guess we have to demonstrate that there are no duplicate combos generated in this way and every combo is in fact generated, but I think this "feels" pretty obvious.

$\endgroup$
0
$\begingroup$

Let me add one purely-combinatorial proof. [edit]: the justification for doing so is that I think we can tell this in a "committee-forming" way that is used for other identities (e.g. Pascal's rule), without needing to think about picking integers (this always tends to confuse me, as a novice, because it isn't always clear to what extent it matters that the items being picked don't merely have quantity but are pure quantity). Here's an example with characters from Scooby Doo! that I think a bright seventh-grader could understand.

Imagine that the Mystery Inc. gang (four people plus a dog) wants to split up. One group will look for clues; the other will stay behind. They want to have a group of three look for clues, so we want to find $\binom{5}{3}$. Suppose we designate a subgroup of two of them, Shaggy and Scooby, to be unreliable when looking for clues alone. So, we think about the process of picking the clues group as ensuring that they have a "safety buddy". We start with Daphne and count all groups involving her and two other people, whether or not this includes Shaggy and/or Scooby: this is $\binom{4}{2}$. It is worth briefly explaining that this holds generally: if we have $n$ people and want to pick a group of $k$ that definitely includes person $i$, then we must count $\binom{n-1}{k-1}$ because we have one fewer person to choose from ($n-1$) but we also only need $k-1$ people because we know that person $i$ will complete the group.

Then, we do the same for all groups involving Fred. Daphne is already spoken for, so this is $\binom{3}{2}$, by a similar argument: we now omit two people from consideration (Daphne: already counted; Fred: must be involved, so we disregard him). Finally, Velma can be involved in only one way that we haven't already counted, which is her serving as Scooby and Shaggy's safety buddy. This leaves only $\binom{2}{2} = 1$ group of counterparts for her.

Generalizing what we've done, we had $k-1$ elements of a group we designated inessential and $n-(k-1)$ designated essential. We wanted to count all groups with at least one (but maybe more) essential element, so all groups of size $k$ by definition (because a group of size $k-1$ is the largest group that might possibly include only inessential members, so a group one bigger is the smallest such group). Then, we simply held out, successively, each essential element and counted the groups that could be formed: each time, we had $1, 2, ... n-(k-1)$ fewer than $n$ elements and needed to pick $k-1$ people to complement the element we were holding out. When we used up all essential elements, no more groups could be formed because the number of inessential elements is smaller than the size of the group (by construction). In formulae, we found that $\binom{n}{k} = \sum_{j=0}^{n-k} \binom{n-j-1}{k-1}$. (Note, by the way, that we did not use, here, the other interpretation of Pascal's rule, that going "up and to the right" means ensuring that one person is not in a group of the same size; there is a way to form this interpretation, but it is less direct). In other words, once we start on the "blade", we move up and to the right because we are reducing our population size by one each time and thereby ensuring that multiple successive people are included in the group.
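
For completeness, the formula reached at the end, $\binom{n}{k}=\sum_{j=0}^{n-k}\binom{n-j-1}{k-1}$, is easy to spot-check, including for the Mystery Inc. numbers $n=5$, $k=3$ (a sketch of my own, assuming Python 3.8+ for `math.comb`; the parameter ranges are arbitrary):

    from math import comb

    # Check C(n, k) = sum_{j=0}^{n-k} C(n-j-1, k-1) for a range of small n and k.
    for n in range(1, 12):
        for k in range(1, n + 1):
            assert comb(n, k) == sum(comb(n - j - 1, k - 1) for j in range(n - k + 1))

    # The Mystery Inc. example: C(5, 3) = C(4, 2) + C(3, 2) + C(2, 2) = 6 + 3 + 1 = 10.
    print(comb(5, 3), comb(4, 2) + comb(3, 2) + comb(2, 2))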

$\endgroup$
