I am working with the various types of convergence of random variables. I came across this exercise:
Consider the following sequence of independent r.v.s $(X_i,\ i\geq 1)$ with $Pr(X_i=i^a)=\frac{1}{i^b}$ and $Pr(X_i=0) = 1-\frac{1}{i^b}$, where $a,b>0$. What are the minimal conditions on $a,b$ such that $X_i\to 0$, considering quadratic convergence, convergence in probability, and almost sure convergence?

I thought of something along these lines. For the quadratic convergence I need $E(X_i^2)\to 0$ as $i \to +\infty$; hence $\frac{(i^a)^2}{i^b} = i^{2a-b} \to 0$ as $i\to +\infty$, which means $a < b/2$. Is that right?
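To sanity-check my condition $a<b/2$, here is a quick numerical sketch of my own (not part of the exercise) that evaluates the exact second moment $E(X_i^2) = (i^a)^2 \cdot i^{-b} = i^{2a-b}$ for a few parameter choices:

```python
# Sanity check: E(X_i^2) = (i^a)^2 * i^(-b) = i^(2a - b),
# which tends to 0 as i grows iff a < b/2.

def second_moment(i, a, b):
    """Exact second moment of X_i, where X_i = i^a with prob i^(-b), else 0."""
    return (i ** a) ** 2 * i ** (-b)  # equals i^(2a - b)

# Only the first pair satisfies a < b/2; the second is the boundary case,
# the third violates it.
for a, b in [(1, 3), (1, 2), (2, 3)]:
    values = [second_moment(i, a, b) for i in (10, 100, 1000)]
    print(f"a={a}, b={b}: E(X_i^2) at i=10,100,1000 -> {values}")
```

For $(a,b)=(1,3)$ the moments shrink like $1/i$, at the boundary $(1,2)$ they stay constant, and for $(2,3)$ they blow up, which matches the strict inequality $a<b/2$.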
As for almost sure convergence, is it OK for every $a,b>0$? I would say that I need $a<0$ in order to be sure to always have $X_i=0$, but since $a>0$ is imposed, $X_i$ is nonzero with probability $1/i^b$. So I need the probability of $X_i$ being different from $0$ to vanish, and for that $b>0$ seems enough. What happens if the conditions are not verified? Will the sequence still converge in some other sense, even if not almost surely?
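To build some intuition about whether $b>0$ alone is enough, I wrote this simulation sketch (my own, it proves nothing): it samples the whole sequence up to a large $N$ and records the index of the last nonzero $X_i$. If $X_i\to 0$ along (almost) every sampled path, the nonzero values should eventually stop occurring, so the last-nonzero index should not keep sitting near $N$:

```python
import random

# Simulation sketch: sample X_1..X_N and record the index of the last
# nonzero X_i (only whether X_i is zero or not matters here, so the
# value i^a itself is irrelevant).

def last_nonzero_index(N, b, rng):
    """Sample the events {X_i != 0}, each with prob i^(-b), for i = 1..N;
    return the largest i at which a nonzero value occurred (0 if none)."""
    last = 0
    for i in range(1, N + 1):
        if rng.random() < i ** (-b):
            last = i
    return last

rng = random.Random(0)
for b in (0.5, 1.0, 2.0):
    lasts = [last_nonzero_index(10_000, b, rng) for _ in range(5)]
    print(f"b={b}: last nonzero index in 5 runs of length 10000 -> {lasts}")
```

In my runs, for small $b$ the nonzero hits keep occurring right up to $N$, while for larger $b$ they die out early, so I suspect $b>0$ alone is not the full story for the almost sure case.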
For the convergence in probability I honestly have no idea how to estimate $Pr(|X_i|>\epsilon)$ as $i\to \infty$. Is it $Pr(i^a>\epsilon)$? And to have it go to $0$, do I need $a>\log_i(\epsilon)$ and $b>0$?
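If my reading is right that $X_i$ takes only the two values $0$ and $i^a$, then the tail probability can be written down exactly, and this snippet of mine just tabulates it for a fixed $\epsilon$:

```python
# X_i takes only the values i^a (with prob i^(-b)) and 0, so for a fixed eps:
#   P(|X_i| > eps) = i^(-b)  if i^a > eps,
#   P(|X_i| > eps) = 0       otherwise.
# Since a > 0, i^a exceeds any fixed eps for all large i,
# and the tail probability is then exactly i^(-b).

def tail_prob(i, a, b, eps):
    """Exact P(|X_i| > eps) for the two-point distribution of X_i."""
    return i ** (-b) if i ** a > eps else 0.0

a, b, eps = 0.5, 1.0, 2.0
for i in (2, 16, 100, 1000):
    print(f"i={i}: P(|X_i| > {eps}) = {tail_prob(i, a, b, eps)}")
```

With these (arbitrary) parameters the tail is $0$ for small $i$ (where $i^a \le \epsilon$) and then decays like $i^{-b}$, which is what made me guess that only $b$ matters here.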