I am reading *Probability Theory* by A. Klenke (3rd edition) and I am trying to solve this exercise:
Let $T>0$ and $X_1, X_2, \dots$ be i.i.d. random variables, uniformly distributed on $[0,1]$. Let $N:=\max\{n \in \mathbb{N}_0: X_1+\dots+X_n\leq T\}$. Compute $E[N]$.
From previous questions I know that $E[N] = e^T$ for $T\in[0,1]$ when $N$ is instead defined as $N:= \min\{n\in\mathbb{N}_0: X_1+\dots+X_n > T\}$. But in this exercise $N$ is defined differently. So, for $T \in [0,1]$ I calculated (using that $N$ is $\mathbb{N}_0$-valued, so $E[N]=\sum_{n=0}^\infty P[N>n]$, and that $\{N>n\}=\{X_1+\dots+X_{n+1}\leq T\}$):
$E[N] = \sum_{n=0}^\infty P[N>n] = \sum_{n=0}^\infty P[X_1+\dots+X_{n+1}\leq T]=\sum_{n=0}^\infty \frac{T^{n+1}}{(n+1)!}=\sum_{s=1}^\infty \frac{T^{s}}{s!}=e^T-1$.
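As a sanity check of this formula, here is a small Monte Carlo simulation (my own script, not from the book): it draws uniform variables until their running sum exceeds $T$, records the resulting $N$, and compares the empirical mean with $e^T-1$.

```python
import math
import random

def sample_N(T, rng):
    """One realization of N = max{n in N_0 : X_1 + ... + X_n <= T}."""
    s, n = 0.0, 0
    while True:
        s += rng.random()   # add the next uniform X_i
        if s > T:
            return n        # the sum just exceeded T, so N = n
        n += 1

rng = random.Random(0)
T = 0.7                     # any T in [0, 1]
trials = 200_000
est = sum(sample_N(T, rng) for _ in range(trials)) / trials
print(est, math.exp(T) - 1)  # the two values should be close
```

For $T=0.7$ the empirical mean agrees with $e^{0.7}-1\approx 1.014$ up to Monte Carlo error, which supports the computation above.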
Is this part correct?
How can I proceed when $T>1$? Thank you.