
Let $X_1, X_2, \ldots$ be an infinite sequence of sub-Gaussian random variables which are not necessarily independent.

My question is how to prove \begin{eqnarray} \mathbb{E}\max_i \frac{|X_i|}{\sqrt{1+\log i}} \leq C K, \end{eqnarray} where $K=\max_i \|X_i\|_{\psi_2}$ and $C$ is an absolute constant. Note that $\|\cdot\|_{\psi_2}$ denotes the Orlicz norm of a sub-Gaussian random variable.

Here is my attempt, and where it gets stuck. Consider the finite case with $i\leq N$. We have \begin{eqnarray} \mathbb{E}\max_{i\leq N} \frac{|X_i|}{\sqrt{1+\log i}} &=& \int_0^\infty \mathbb{P}\left(\max_{i\leq N} \frac{|X_i|}{\sqrt{1+\log i}} > t \right) dt \\ &\leq& \int_0^\infty \sum_{i=1}^N\mathbb{P}\left( \frac{|X_i|}{\sqrt{1+\log i}} > t \right) dt \\ &\leq& \sum_{i=1}^N \frac{2}{\sqrt{1+\log i}} \int_0^\infty e^{-cs^2/K^2}ds \\ &=& K\sqrt{\frac{\pi}{c}} \sum_{i=1}^N \frac{1}{\sqrt{1+\log i}} \end{eqnarray} where the first inequality holds by a union bound, and the second by sub-Gaussianity of $X_i$ (i.e. $\mathbb{P}\{|X_i|\geq t\} \leq 2 e^{-ct^2/\|X_i\|_{\psi_2}^2}$, with $c$ an absolute constant) followed by the change of variables $s := t\sqrt{1+\log i}$.

However, the problem with this bound is that $\sum_{i=1}^N \frac{1}{\sqrt{1+\log i}}\to\infty$ as $N\to\infty$ (each term is at least $\frac{1}{\sqrt{1+\log N}}$, so the sum grows at least like $N/\sqrt{1+\log N}$). The inequalities I used are evidently not sharp enough. What is the right inequality to use in this case?
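Just to convince myself numerically that this divergence is real, here is a quick Python sanity check (not part of the question, only an illustration):

```python
# Sanity check: the partial sums of 1/sqrt(1 + log i) diverge,
# so the naive union bound over the whole range of t cannot work.
import math

for N in (10, 10**3, 10**5, 10**6):
    s = sum(1.0 / math.sqrt(1.0 + math.log(i)) for i in range(1, N + 1))
    print(f"N = {N:>7}: sum = {s:,.1f}")

# Each term is at least 1/sqrt(1 + log N), so the sum is >= N / sqrt(1 + log N).
```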

This question is Exercise 2.5.10 in Prof. Roman Vershynin's book "High-Dimensional Probability". An electronic version of the book can be downloaded from his personal webpage.


2 Answers


Based on the answer from Behrad Moniri, I will fill in the last steps with explicit $c$ and $K$ (Behrad Moniri's normalization amounts to dropping the factor $c/K^2$ from the exponent). The key, as besmel's answer pointed out, is to truncate the integral; we cut it at $t_0 = \sqrt{\frac{2}{c}}\, K$, though any point beyond $K / \sqrt{c}$ would do. Write $Y_i = |X_i| / \sqrt{1 + \log i} \ge 0$. Then

$$ \begin{align} \mathbb{E}\max_i Y_i = \int_0^\infty \mathbb{P}(\max_i Y_i > t)\, dt &\le \int_0^{t_0} \mathbb{P}(\max_i Y_i > t)\, dt + \int_{t_0}^{\infty}\sum_i 2 \exp\left\{ - \frac{c}{K^2}\, t^2 (1 + \log i)\right\} dt\\ &\le \sqrt{\frac{2}{c}}\, K + \int_{t_0}^{\infty} \sum_i 2\exp\left\{ -\frac{c}{K^2}\,t^2 \right\} i^{-ct^2/K^2}\, dt \\ &\le \sqrt{\frac{2}{c}}\, K + \int_{0}^{\infty} \sum_i 2\exp\left\{ -\frac{c}{K^2}\,t^2 \right\} i^{-2}\, dt \\ &= \frac{\sqrt{2} + \frac{\pi^2\sqrt{\pi}}{6}}{\sqrt{c}}\, K. \end{align} $$

Here the first term uses $\mathbb{P}(\max_i Y_i > t) \le 1$; the third line uses $ct^2/K^2 \ge 2$ for $t \ge t_0$; and the last step uses $\sum_i i^{-2} = \pi^2/6$ together with $\int_0^\infty e^{-ct^2/K^2}\, dt = \frac{K}{2}\sqrt{\pi/c}$.
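As a quick sanity check (not part of the proof), here is a small Monte Carlo sketch in Python with independent standard normals; these are sub-Gaussian, and for the illustration I take $c/K^2 = 1/2$, which is valid for the standard normal tail $\mathbb{P}(|g|\ge t)\le 2e^{-t^2/2}$. The constant derived above is then $(\sqrt{2} + \pi^2\sqrt{\pi}/6)\sqrt{2} \approx 6.12$, and the simulated expectation stays well below it, uniformly in $N$:

```python
# Monte Carlo illustration with independent N(0,1) variables (purely for
# illustration; the result itself needs no independence).
import numpy as np

rng = np.random.default_rng(0)
for N in (10**2, 10**4, 10**5):
    w = np.sqrt(1.0 + np.log(np.arange(1, N + 1)))       # sqrt(1 + log i)
    reps = [np.max(np.abs(rng.standard_normal(N)) / w) for _ in range(100)]
    print(f"N = {N:>6}: E max_i |X_i|/sqrt(1+log i) ~ {np.mean(reps):.3f}")

# Bound derived above, evaluated with c/K^2 = 1/2:
print("upper bound ~", (np.sqrt(2) + np.pi**2 * np.sqrt(np.pi) / 6) * np.sqrt(2))
```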

As a side note, if we assume all the $X_i$'s are independent, Stirling's approximation leads to the same result more easily.


Without loss of generality, normalize so that $c/K^2 = 1$, i.e. the sub-Gaussian tail bound reads $\mathbb{P}(|X_i| \ge t) \le 2e^{-t^2}$; the general case follows by rescaling $X_i \mapsto \sqrt{c}\,X_i/K$, which multiplies the final bound by $K/\sqrt{c}$.

\begin{eqnarray} \mathbb{E}\max_i \frac{|X_i|}{\sqrt{1+\log i}} &=& \int_0^\infty \mathbb{P}\left(\max_i \frac{|X_i|}{\sqrt{1+\log i}} > t \right) dt\\ &=& \int_0^2 \mathbb{P}\left(\max_i \frac{|X_i|}{\sqrt{1+\log i}} > t \right) dt + \int_2^\infty \mathbb{P}\left(\max_i \frac{|X_i|}{\sqrt{1+\log i}} > t \right) dt \\&\leq& 2 + \int_2^\infty \sum_{i=1}^N\mathbb{P}\left( \frac{|X_i|}{\sqrt{1+\log i}} > t \right) dt \\ &\leq& 2 + \int_2^\infty \sum_{i=1}^N 2 \exp\big(-t^2(1+\log i)\big)\, dt\\ &=& 2 + 2\sum_{i=1}^N \int_2^\infty \exp(-t^2)\; i^{-t^2}\, dt \\ &\leq& 2 + 2\sum_{i=1}^N i^{-4} \int_2^\infty \exp(-t^2)\, dt \;<\; \infty \end{eqnarray}

I chose $2$ as the point at which to split the integral so that the sum converges; any splitting point greater than $1$ would also work. Since the resulting bound is uniform in $N$, letting $N\to\infty$ yields the claim for the infinite sequence by monotone convergence.
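For concreteness, here is a quick numeric evaluation of the final bound under the normalization above, using $\sum_{i\ge 1} i^{-4} = \pi^4/90$ and $\int_2^\infty e^{-t^2}\,dt = \frac{\sqrt{\pi}}{2}\operatorname{erfc}(2)$ (a minimal Python sketch):

```python
# Numeric value of the bound 2 + 2 * (sum_i i^{-4}) * int_2^inf exp(-t^2) dt,
# which is uniform in N.
import math

tail = 0.5 * math.sqrt(math.pi) * math.erfc(2.0)  # integral of exp(-t^2) over [2, inf)
zeta4 = math.pi ** 4 / 90                         # sum of i^{-4} over i >= 1
print("bound:", 2 + 2 * zeta4 * tail)             # ~ 2.009
```

So the bound is about $2.009\,K/\sqrt{c}$ after undoing the normalization.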

So what's next? How to deduce $CK$? (Commented Jun 30, 2019 at 3:56)
