
A real random variable ${X}$ is said to have a standard Cauchy distribution if it has the probability density function $\displaystyle {x \mapsto \frac{1}{\pi} \frac{1}{1+x^2}}$. If ${X_1,X_2,\dots}$ are iid copies of a random variable ${X}$ with the standard Cauchy distribution, show that $\displaystyle {\frac{|X_1|+\dots+|X_n|}{n \log n}}$ converges in probability to $\displaystyle {\frac{2}{\pi}}$.
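
For intuition (not part of any argument below), here is a quick Monte Carlo illustration of the claim; the sample sizes are arbitrary choices, and since the normalization is only logarithmic, the ratio approaches $2/\pi \approx 0.6366$ quite slowly:

```python
import numpy as np

rng = np.random.default_rng(0)
for k in range(3, 7):
    n = 10**k
    # |X_1|, ..., |X_n| for standard Cauchy X_i
    s = np.abs(rng.standard_cauchy(n)).sum()
    print(f"n = 10^{k}: S_n/(n log n) = {s / (n * np.log(n)):.4f}")
print(f"target: 2/pi = {2 / np.pi:.4f}")
```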

Attempt: We use a truncation argument, posted as an answer below. Verifications and suggestions for a different approach are welcome.


2 Answers


I'll use the following theorem; a proof is available in Allan Gut's *Probability: A Graduate Course*:

Theorem 4.2. Suppose that $X, X_1, X_2, \ldots$ are independent, identically distributed random variables with partial sums $S_n$, $n \ge 1$. Further, let, for $x > 0$, $b(x) = x^{1/\rho}\ell(x)$, where $\ell \in \text{SV}$ and $\rho \in (0, 1]$. Finally, set $b_n = b(n)$, $n \ge 1$. Then $$ \dfrac{S_n - n \mathbb{E}[X \cdot 1\{\vert X \vert \le b_n\}]}{b_n} \stackrel{\mathbb{P}}{\rightarrow} 0 $$ if and only if $$ n\mathbb{P}(\vert X \vert > b_n) \rightarrow 0 \text{ as } n \rightarrow \infty. $$

where $\text{SV}$ denotes the collection of slowly varying functions. Here $b(x) = x \ln x$ for $x > 0$, i.e. $\rho = 1$ and $\ell(x) = \ln x$, and one can check that $\ln x \in \text{SV}$. Applying the theorem to the i.i.d. sequence $\vert X_1 \vert, \vert X_2 \vert, \ldots$ with partial sums $S_n = \sum_{k = 1}^n \vert X_k \vert$, we need to verify that $$ n\mathbb{P}(\vert X \vert > n \ln n) \rightarrow 0 \text{ as } n \rightarrow \infty. $$ The LHS can be found in exact form: $$ n\mathbb{P}(\vert X \vert > n \ln n) = n\left(1 - \int_{-n \ln n}^{n\ln n} \dfrac{dx}{\pi (1 + x^2)}\right) = n\left[1 - \dfrac{2}{\pi}\arctan(n \ln n)\right] \rightarrow 0, $$ where the limit can be computed via L'Hospital's rule, or from the asymptotic $\frac{\pi}{2} - \arctan t \sim \frac{1}{t}$ as $t \rightarrow \infty$.

Thus, the theorem above implies that $$ \dfrac{S_n}{n \ln n} - \dfrac{\mathbb{E}[\vert X \vert \cdot 1\{\vert X \vert \le n \ln n\}]}{\ln n} = \dfrac{S_n - n \mathbb{E}[\vert X \vert \cdot 1\{\vert X \vert \le n \ln n\}]}{n \ln n} \stackrel{\mathbb{P}}{\rightarrow} 0. $$ But $$ \dfrac{\mathbb{E}[\vert X \vert \cdot 1\{\vert X \vert \le n \ln n\}]}{\ln n} = \dfrac{\ln(1 + n^2 (\ln n)^2)}{\pi \ln n} \xrightarrow{n \rightarrow \infty} \dfrac{2}{\pi}. $$ Therefore, $$ \dfrac{\vert X_1 \vert + \ldots + \vert X_n \vert}{n \ln n} = \dfrac{S_n}{n \ln n} = \left[\dfrac{S_n}{n \ln n} - \dfrac{\mathbb{E}[\vert X \vert \cdot 1\{\vert X \vert \le n \ln n\}]}{\ln n}\right] + \dfrac{\mathbb{E}[\vert X \vert \cdot 1\{\vert X \vert \le n \ln n\}]}{\ln n} \xrightarrow{\mathbb{P}} \dfrac{2}{\pi}. $$
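As a numerical check of the two limits used above (my own script, not part of the proof), one can evaluate both quantities in closed form; the identity $1 - \frac{2}{\pi}\arctan b = \frac{2}{\pi}\arctan\frac{1}{b}$ is used to avoid catastrophic cancellation for large $b$:

```python
import math

for k in [2, 4, 8, 16]:
    n = 10.0**k
    b = n * math.log(n)
    # n * P(|X| > n ln n); uses 1 - (2/pi) arctan(b) = (2/pi) arctan(1/b)
    tail = n * (2 / math.pi) * math.atan(1 / b)
    # truncated-mean term; should approach 2/pi ~ 0.6366
    mean_term = math.log(1 + b * b) / (math.pi * math.log(n))
    print(f"n = 10^{k}: n*P(|X| > n ln n) = {tail:.4f}, mean term = {mean_term:.4f}")
```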

  • This is a nice proof as well. On the other hand, is there any issue with the original solution? – shark, Jun 21 at 2:08

Let $S_n := |X_1| + \dots + |X_n|$. We truncate each $|X_i|$ for $1 \leq i \leq n$ at $Cn$ for some $C$ to be chosen later, writing $|X_i| = |X_i|_{\leq Cn} + |X_i|_{> Cn}$, where $|X_i|_{\leq Cn} := |X_i|1_{|X_i| \leq Cn}$ and $|X_i|_{> Cn} := |X_i|1_{|X_i| > Cn}$. Similarly, we decompose $S_n = S_{n, \leq} + S_{n, >}$, where $S_{n, \leq} := |X_1|_{\leq Cn} + \dots + |X_n|_{\leq Cn}$ and $S_{n, >} := |X_1|_{> Cn} + \dots + |X_n|_{> Cn}$.
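A two-line illustration of this decomposition (values of $n$ and $C$ are arbitrary, for demonstration only):

```python
import numpy as np

rng = np.random.default_rng(1)
n, C = 10_000, 2.0
x = np.abs(rng.standard_cauchy(n))         # |X_1|, ..., |X_n|
S_le = np.where(x <= C * n, x, 0.0).sum()  # S_{n,<=}
S_gt = np.where(x > C * n, x, 0.0).sum()   # S_{n,>}
assert np.isclose(S_le + S_gt, x.sum())    # S_n = S_{n,<=} + S_{n,>}
```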

The random variable $|X|1_{|X| \leq Cn}$ can be computed to have mean

$\displaystyle {\bf E}(|X|1_{|X| \leq Cn}) = \frac{1}{\pi} \int_{\bf R} \frac{|x|1_{|x| \leq Cn}}{{1+x^2}}\ dx = \frac{1}{\pi} \int_{-Cn}^{Cn} \frac{|x|}{1 + x^2}\ dx = \log(C^2 n^2 + 1) / \pi$

and we can upper bound the variance by

$\displaystyle {\bf Var}(|X|1_{|X| \leq Cn}) \leq {\bf E}\big[(|X|1_{|X| \leq Cn})^2\big] = \frac{1}{\pi} \int_{\bf R} \frac{x^2 1_{|x| \leq Cn}}{1+x^2}\ dx = \frac{2}{\pi}\big(Cn - \tan^{-1}(Cn)\big)$

and hence $S_{n,\leq}/n$ has mean $\log(C^2 n^2 + 1)/\pi$ and variance at most $\displaystyle \frac{2(Cn - \tan^{-1}(Cn))}{\pi n}$ (the $n$ independent summands contribute $n$ times the one-term variance, and dividing by $n$ scales the variance by $1/n^2$). By Chebyshev's inequality, we thus have

$\displaystyle {\bf P}\big(|S_{n,\leq}/n - \log(C^2 n^2 + 1)/\pi| \geq \lambda\big) \leq \frac{2(Cn - \tan^{-1}(Cn))}{\pi n \lambda^2}$

for any $\lambda > 0$.
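To guard against slips in the two closed forms above (and hence in the Chebyshev bound), one can compare them against direct numerical integration; this sketch assumes SciPy is available, and the test values of $C$ and $n$ are arbitrary:

```python
import math
from scipy.integrate import quad  # assumes SciPy is available

C, n = 2.0, 50
b = C * n

# truncated mean: integral vs closed form log(C^2 n^2 + 1)/pi
mean_num, _ = quad(lambda x: abs(x) / (math.pi * (1 + x * x)), -b, b, points=[0])
mean_closed = math.log(C * C * n * n + 1) / math.pi

# second moment (the variance upper bound): integral vs (2/pi)(Cn - arctan(Cn))
m2_num, _ = quad(lambda x: x * x / (math.pi * (1 + x * x)), -b, b)
m2_closed = (2 / math.pi) * (b - math.atan(b))

print(mean_num, mean_closed)  # should agree to quadrature accuracy
print(m2_num, m2_closed)
```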

We now turn to $S_{n,>}$. Observe that the random variable ${|X|1_{|X|>Cn}}$ is nonzero only with probability

$\displaystyle 1 - {\bf P}(|X| \leq Cn) = 1 - \frac{1}{\pi}\int_{-Cn}^{Cn} \frac{1}{1 + x^2}\ dx = 1 - \frac{2 \tan^{-1}(Cn)}{\pi}$.

Thus, by the union bound, $S_{n,>}$ is nonzero with probability at most $\displaystyle n\left(1 - \frac{2 \tan^{-1}(Cn)}{\pi}\right)$. By the triangle inequality (on the event $S_{n,>} = 0$ we have $S_n = S_{n,\leq}$), we conclude that

$\displaystyle {\bf P}\big(|S_n/n - \log(C^2 n^2 + 1)/\pi| \geq \lambda\big) \leq \frac{2(Cn - \tan^{-1}(Cn))}{\pi n \lambda^2} + n\left(1 - \frac{2 \tan^{-1}(Cn)}{\pi}\right)$

which is valid for any $\lambda > 0$. If we pick $C = \lambda = \sqrt{\log n}$, we see that

$\displaystyle {\bf P}\left(\left|\frac{S_n}{n\log n} - \frac{\log(n^2 \log n + 1)}{\pi \log n}\right| \geq \frac{1}{\sqrt{\log n}}\right) \leq O\left(\frac{1}{\log^{1/2} n}\right)$

Since $\displaystyle \frac{\log(n^2 \log n + 1)}{\pi \log n} = \frac{2}{\pi} + O\left(\frac{\log\log n}{\log n}\right)$ and $\frac{\log\log n}{\log n} = o\big(1/\sqrt{\log n}\big)$, rerunning the bound with $\lambda = \frac{1}{2}\sqrt{\log n}$ absorbs the difference in centerings, and for large $n$ we obtain

$\displaystyle {\bf P}\left(\left|\frac{S_n}{n\log n} - \frac{2}{\pi}\right| \geq \frac{1}{\sqrt{\log n}}\right) \leq O\left(\frac{1}{\log^{1/2} n}\right)$,

giving the result.
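Finally, a numerical look at the two error terms in the combined bound with $C = \lambda = \sqrt{\log n}$ (again my own check, not part of the proof); both decay like $1/\sqrt{\log n}$, which is why the convergence is so slow:

```python
import math

for k in [3, 6, 12, 24]:
    n = 10.0**k
    L = math.log(n)
    C = lam = math.sqrt(L)
    b = C * n
    cheb = 2 * (b - math.atan(b)) / (math.pi * n * lam**2)  # Chebyshev term
    tail = n * (2 / math.pi) * math.atan(1 / b)  # stable form of n(1 - 2 arctan(Cn)/pi)
    print(f"log n = {L:5.1f}: Chebyshev = {cheb:.4f}, tail = {tail:.4f}, "
          f"1/sqrt(log n) = {1 / math.sqrt(L):.4f}")
```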
