The definition of convergence in probability of a sequence of random variables is:
for every $\epsilon > 0$
$P(|Y_{n} - c| \ge \epsilon) \to 0 \:$ as $\: n \to \infty$
From my statistics text I have noticed the following theorem:
A sufficient condition that $Y_{n} \xrightarrow{p} c \:\:$ is that $\:\:E(Y_{n} - c)^{2} \xrightarrow{} 0\:\:$ as $\:n \xrightarrow{} \infty$
How do I prove this?
I assume that this has something to do with Chebyshev's inequality, which states:
For any random variable $Y$ and constants $a > 0$ and $c$,
$P(|Y - c| \ge a) \le \frac{E(Y - c)^{2}}{a^{2}}$
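To sanity-check the claim numerically before proving it, here is a small simulation sketch (my own construction, not from the text): take $Y_n$ to be the sample mean of $n$ iid Uniform(0,1) draws, so $c = 0.5$ and $E(Y_n - c)^2 = \mathrm{Var}(Y_n) = \frac{1}{12n} \to 0$. The tail probability $P(|Y_n - c| \ge \epsilon)$ should then shrink toward 0 as $n$ grows, and Chebyshev's bound (with $a = \epsilon$) should always sit above it.

```python
import random

def tail_prob_and_mse(n, eps=0.05, c=0.5, trials=2000, seed=0):
    """Estimate P(|Y_n - c| >= eps) and E[(Y_n - c)^2] by simulation,
    where Y_n is the mean of n iid Uniform(0,1) draws (so c = 0.5)."""
    rng = random.Random(seed)
    exceed = 0
    sq_err = 0.0
    for _ in range(trials):
        y = sum(rng.random() for _ in range(n)) / n
        sq_err += (y - c) ** 2
        if abs(y - c) >= eps:
            exceed += 1
    return exceed / trials, sq_err / trials

for n in (10, 100, 1000):
    p, mse = tail_prob_and_mse(n)
    # Chebyshev: P(|Y_n - c| >= eps) <= E[(Y_n - c)^2] / eps^2
    print(f"n={n:5d}  tail prob ~ {p:.3f}  MSE ~ {mse:.5f}  bound = {mse / 0.05**2:.3f}")
```

Both the estimated mean-squared error and the estimated tail probability decay with $n$, and the tail probability never exceeds the Chebyshev bound, which is exactly the mechanism the proof should exploit.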