
Possible Duplicate:
choose a random number between 0 and 1 and record its value. and keep doing it until the sum of the numbers exceeds 1. how many tries?

So I'm reading a book about simulation, and in one of the chapters about random number generation I found the following exercise:


For independent uniform $(0,1)$ random variables $U_1, U_2, \dots$ define

$$ N = \min \bigg \{ n : \sum_{i=1}^n U_i > 1 \bigg \} $$

Give an estimate for the value of $E[N]$.


That is: $N$ is equal to the number of random numbers uniformly distributed in $(0,1)$ that must be summed to exceed $1$. What's the expected value of $N$?

I wrote some code and saw that the simulated estimate of $E[N]$ settles near $\mathrm e = 2.718\dots$
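
Roughly, the simulation looks like this (a minimal Python sketch; the function names and the number of trials are just my choices, not from the book):

```python
import random

def count_until_sum_exceeds_one():
    """Draw uniform(0,1) numbers until their running sum exceeds 1; return how many were needed."""
    total = 0.0
    n = 0
    while total <= 1.0:
        total += random.random()
        n += 1
    return n

def estimate_expected_n(trials=1_000_000):
    """Monte Carlo estimate of E[N]."""
    return sum(count_until_sum_exceeds_one() for _ in range(trials)) / trials

print(estimate_expected_n())  # prints a value close to 2.718...
```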

The book does not ask for a formal proof of this fact, but now I'm curious!

So I would like to ask for

  • A (possibly) simple (= undergraduate-level) analytic proof of this fact
  • An intuitive explanation for this fact

or both.


2 Answers


Here is a way to compute $\mathbb E(N)$. We begin by complicating things, namely, for every $x$ in $(0,1)$, we consider $m_x=\mathbb E(N_x)$ where
$$ N_x=\min\left\{n\,;\,\sum_{k=1}^nU_k\gt x\right\}. $$
Our goal is to compute $m_1$ since $N_1=N$.

Assume that $U_1=u$ for some $u$ in $(0,1)$. If $u\gt x$, then $N_x=1$. If $u\lt x$, then $N_x=1+N'$ where $N'$ is distributed like $N_{x-u}$. Hence
$$ m_x=1+\int_0^xm_{x-u}\,\mathrm du=1+\int_0^xm_{u}\,\mathrm du. $$
Thus, $x\mapsto m_x$ is differentiable with $m'_x=m_x$. Since $m_0=1$, $m_x=\mathrm e^x$ for every $x\leqslant1$, in particular $\mathbb E(N)=m_1=\mathrm e$.
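
(A quick numerical sanity check, not part of the answer itself: the short Python sketch below, with names of my choosing, simulates $N_x$ for a few values of $x$ and compares the sample mean with $\mathrm e^x$.)

```python
import math
import random

def sample_N_x(x):
    """Number of uniform(0,1) draws needed for the running sum to exceed x."""
    total, n = 0.0, 0
    while total <= x:
        total += random.random()
        n += 1
    return n

for x in (0.25, 0.5, 0.75, 1.0):
    trials = 200_000
    estimate = sum(sample_N_x(x) for _ in range(trials)) / trials
    print(f"x={x}: simulated m_x = {estimate:.4f}, e^x = {math.exp(x):.4f}")
```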

  • To provide a more general background for this idea: it is a standard technique in renewal theory, see The renewal equation. – Roun (Jan 3, 2018)

In fact it turns out that $P(N = n) = \dfrac{n-1}{n!}$ for $n \ge 2$.

Let $S_n = \sum_{j=1}^n U_j$, and $f_n(s)$ the probability density function for $S_n$. For $0 < x < 1$ we have $f_1(x) = 1$ and $f_{n+1}(x) = \int_0^x f_n(s)\, ds$. By induction, we get $f_n(x) = x^{n-1}/(n-1)!$ for $0 < x < 1$, and thus $P(S_n < 1) = \int_0^1 f_n(s)\, ds = \dfrac{1}{n!}$. Now
\begin{align*} P(N=n) &= P(S_{n-1} < 1 \le S_n)\\ &= P(S_{n-1} < 1) - P(S_n \le 1)\\ &= \frac{1}{(n-1)!} - \frac{1}{n!} \\ &= \frac{n-1}{n!}. \end{align*}
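
(For completeness, since the answer stops at the distribution: summing against this pmf recovers the expectation in one line,
$$ E[N] = \sum_{n \ge 2} n \, P(N=n) = \sum_{n \ge 2} \frac{n(n-1)}{n!} = \sum_{n \ge 2} \frac{1}{(n-2)!} = \sum_{k \ge 0} \frac{1}{k!} = \mathrm e. $$
)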

  • Isn't the distribution of the sum a bit more complex? It is called the Irwin–Hall distribution, no? en.wikipedia.org/wiki/Irwin%E2%80%93Hall_distribution – Dnaiel (Jan 3, 2017)
  • The full distribution of $S_n$ is complicated, but we're only concerned with the interval $[0,1]$, where it is easy. (Jan 3, 2017)
  • Got it, thanks for the clarification. – Dnaiel (Jan 3, 2017)
