
The definition of convergence in probability of a sequence of random variables $(Y_n)$ to a constant $c$ is:

for every $\epsilon > 0$

$P(|Y_n - c| \ge \epsilon) \to 0$ as $n \to \infty$


From my statistics text I have noticed the following theorem:

A sufficient condition for $Y_{n} \xrightarrow{p} c$ is that $E(Y_{n} - c)^{2} \to 0$ as $n \to \infty$.


How do I prove this?


I assume this has something to do with Chebyshev's inequality, which states:

For any random variable $Y$ and constants $a > 0$ and $c$,

$P(|Y - c| \ge a) \le \frac{E(Y - c)^{2}}{a^{2}}$

  • You already have a proof. I don't know why you asked this question. Replace $Y$ by $Y_n$ in your inequality and you are done. Commented Sep 11, 2021 at 8:14
  • But how does convergence with inequalities work? Are there rules involved? Can you point me to a source? Commented Sep 11, 2021 at 8:17

1 Answer


Given two non-negative sequences $(a_n)_{n \in \mathbb{N}}$ and $(b_n)_{n \in \mathbb{N}}$ such that $a_n \leq b_n$ for all $n \in \mathbb{N}$ and $b_n \to 0$, we also have $a_n \to 0$. To see this, fix an arbitrary $\delta > 0$: since $b_n \to 0$, there exists $N \in \mathbb{N}$ such that $b_n < \delta$ for all $n \geq N$, and the same $N$ works for $(a_n)$, because $a_n \leq b_n < \delta$ for all $n \geq N$. Hence $a_n \to 0$.

Now apply the inequality you stated with $Y = Y_n$ and $a = \epsilon$: for arbitrary $\epsilon > 0$, $$\underbrace{P(|Y_n-c| \ge \epsilon)}_{=:a_n} \leq \underbrace{\frac{E(Y_n-c)^{2}}{\epsilon^{2}}}_{=:b_n} \to 0$$ So $\lim_{n \to \infty} P(|Y_n-c| \ge \epsilon) = 0$, which is exactly $Y_n \stackrel{p}{\to} c$.
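To make the theorem concrete, here is a minimal Monte Carlo sketch (my own illustration, not from the thread): assuming NumPy is available, it takes $Y_n$ to be the mean of $n$ i.i.d. Exponential(1) draws, so $c = 1$ and $E(Y_n - c)^2 = \operatorname{Var}(Y_n) = 1/n \to 0$. The distribution, $\epsilon = 0.1$, and the trial count are arbitrary choices.

```python
import numpy as np

# Illustrative setup (hypothetical, not part of the proof): Y_n is the mean
# of n i.i.d. Exponential(1) draws, so c = 1 and E[(Y_n - c)^2] = Var(Y_n) = 1/n.
rng = np.random.default_rng(0)
c, eps, trials = 1.0, 0.1, 1_000

for n in [10, 100, 1_000, 10_000]:
    y_n = rng.exponential(scale=1.0, size=(trials, n)).mean(axis=1)  # one Y_n per trial
    p_hat = np.mean(np.abs(y_n - c) >= eps)   # Monte Carlo estimate of P(|Y_n - c| >= eps)
    bound = (1.0 / n) / eps**2                # Chebyshev bound: E(Y_n - c)^2 / eps^2
    print(f"n={n:6d}   P(|Y_n - c| >= {eps}) ~ {p_hat:.3f}   Chebyshev bound: {bound:.3f}")
```

Both the estimated probability and the bound shrink toward $0$ as $n$ grows (the bound is vacuous for small $n$, where it exceeds $1$), which is exactly the convergence the proof establishes.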

