
Let ${X_1,X_2,\dots}$ be iid copies of an unsigned random variable ${X}$ with infinite mean, and write ${S_n := X_1 + \dots + X_n}$. Show that ${S_n/n}$ diverges to infinity in probability, in the sense that ${{\bf P}( S_n/n \geq M ) \rightarrow 1}$ as ${n \rightarrow \infty}$ for any fixed ${M < \infty}$.

Attempt: Fix some $M > 0$. Since $X$ is not absolutely integrable, it must be that ${\bf P}(X = \infty) = \delta > 0$. By assumption, for every $0 < \theta \leq 1$ we can choose $n$ sufficiently large that $\theta\, {\bf E} X 1_{|X| \leq n} \geq \theta n \delta > M$. We write $X_i = X_{i,\leq n} + X_{i, > n}$, where $X_{i, \leq n} := X_i 1_{|X_i| \leq n}$ and $X_{i,>n} := X_i 1_{|X_i| > n}$, and similarly decompose $S_n = S_{n,\leq} + S_{n,>}$, where $\displaystyle S_{n,\leq} := X_{1,\leq n} + \dots + X_{n,\leq n}$ and $\displaystyle S_{n,>} := X_{1,>n} + \dots + X_{n,>n}$. By the iid assumption and the Paley–Zygmund inequality, we get:

$\displaystyle {\bf P}(S_{n, \leq} / n > M) \geq {\bf P}\big(S_{n, \leq} / n > \theta\, {\bf E} X 1_{|X| \leq n}\big) = \big[{\bf P}\big(X 1_{|X| \leq n} > \theta\, {\bf E} X 1_{|X| \leq n}\big)\big]^n \geq \Big[(1 - \theta)^{2} \frac{({\bf E} X 1_{|X| \leq n})^2}{{\bf E}(X 1_{|X| \leq n})^2}\Big]^n \stackrel{n}{\sim} (1 - \theta)^{2n}.$
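(Here the Paley–Zygmund inequality is meant in its standard form: for an unsigned $Z$ with $0 < {\bf E} Z^2 < \infty$ and $0 \leq \theta \leq 1$,

$\displaystyle {\bf P}(Z > \theta\, {\bf E} Z) \geq (1 - \theta)^2 \, \frac{({\bf E} Z)^2}{{\bf E} Z^2},$

applied with $Z = X 1_{|X| \leq n}$.)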

As $\theta > 0$ is arbitrary, we conclude that

$\displaystyle {\bf P}(S_n/n \geq M) = {\bf P}(S_{n,\leq}/n + S_{n,>}/n \geq M) \rightarrow 1$ as desired.

Is this a valid truncation argument?

  • $f(x) \propto 1/x^2$ – Andrew, Jun 5 at 0:09
  • It is not true that $X$ must be infinity with positive probability. Also, what does “unsigned” mean? – Michael, Jun 5 at 0:12
  • I don’t understand the purpose of $\theta$. I would assume a truncation argument would define $Y_n = \min(X_n, v)$. – Michael, Jun 5 at 0:17
  • $\begingroup$ "Since $X$ is not absolutely integrable, it must be that ${\bf P}(X = \infty) = \delta > 0$". This is entirely false. There are many examples like the Cauchy distribution. Other than that, it is often helpful to think of functions from the probability space $([0,1],\mathcal{B},\lambda)$. So take any function like $f(x)=\frac{1}{x}$ and see that it is finite for almost every $x\in[0,1]$, i.e. $\lambda\{x:f(x)=\infty\}=0$ but $\int_{0}^{1}\frac{1}{x}\,dx=\infty$. $\endgroup$ Commented Jun 7 at 15:59
  • @Michael: Correct, the non-integrability assumption only implies non-negligible tails. And the word "unsigned" simply means that $X \in [0, \infty]$. – shark, Jun 7 at 23:12

1 Answer


Assume that $X_n\ge 0$ and $\mathbb{E}(X_n)=+\infty$. There is no need to work with extended random variables: if $\mathbb{P}(X_n=\infty)>0$ the claim is immediate (almost surely some $X_i=\infty$ among the first $n$ once $n$ is large), so we may assume $\mathbb{P}(X_n<\infty)=1$. The argument is the following.

Let $M>0$ and $X_n^{(M)}=\min(X_n,M)$. Then the $X_n^{(M)}$ are i.i.d. with finite mean (as they take values in $[0,M]$). Moreover, since $X_n^{(M)}\uparrow X_n$ as $M\to\infty$, the monotone convergence theorem gives
$$ \mu_M=\mathbb{E}(X_n^{(M)})\to +\infty\text{ as } M\to\infty. $$
At the same time,
$$ \frac1n\sum_{i=1}^n X_i\ge \frac1n\sum_{i=1}^n X_i^{(M)}. $$
As $n\to\infty$, the RHS converges almost surely to $\mu_M$ by the strong law of large numbers (the $X_i^{(M)}$ are i.i.d. and bounded), hence
$$ \liminf_{n\to\infty}\frac1n\sum_{i=1}^n X_i\ge \liminf_{n\to\infty}\frac1n\sum_{i=1}^n X_i^{(M)} =\lim_{n\to\infty}\frac1n\sum_{i=1}^n X_i^{(M)}=\mu_M\text{ a.s.} $$
But the LHS does not depend on $M$, and the RHS tends to $\infty$ as $M\to\infty$, so
$$ \liminf_{n\to\infty}\frac1n\sum_{i=1}^n X_i=\infty\text{ a.s.,} $$
i.e. $S_n/n\to\infty$ almost surely. In particular $\mathbb{P}(S_n/n\ge M)\to 1$ for every fixed $M$, which is the desired divergence in probability.
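For intuition, here is a minimal numerical sketch of this argument (an illustration, not part of the proof), assuming the concrete infinite-mean distribution $\mathbb{P}(X>x)=1/x$ for $x\ge 1$, for which the truncated means are $\mu_M=\mathbb{E}\min(X,M)=1+\log M$:

```python
# Illustration (not a proof): sample averages of an infinite-mean
# nonnegative variable, and the truncated means mu_M = E[min(X, M)].
# Assumed distribution: P(X > x) = 1/x for x >= 1, so E[X] = infinity.
import numpy as np

rng = np.random.default_rng(0)

def sample(n):
    # Inverse-CDF sampling: if U ~ Uniform(0, 1), then X = 1/U
    # satisfies P(X > x) = P(U < 1/x) = 1/x for x >= 1.
    # (nextafter avoids a zero denominator.)
    return 1.0 / rng.uniform(low=np.nextafter(0, 1), high=1.0, size=n)

# S_n / n keeps drifting upward with n instead of settling near a finite mean.
for n in (10**3, 10**5, 10**7):
    print(f"n = {n:>8}:  S_n/n ~ {sample(n).mean():.1f}")

# The truncated means mu_M = E[min(X, M)] = 1 + log(M) diverge as M grows,
# matching the monotone convergence step in the answer above.
for M in (10.0, 100.0, 1000.0):
    est = np.minimum(sample(10**6), M).mean()
    print(f"M = {M:>6}:  empirical mu_M ~ {est:.2f},  exact = {1 + np.log(M):.2f}")
```

The first loop shows the running average growing without stabilizing, and the second shows the empirical truncated means tracking $1+\log M$.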

  • I've just worked out the same solution. It's really obvious after realizing that one can make the horizontally truncated expectation as large as desired, plus the monotone convergence theorem. – shark, Jun 7 at 23:11
