Consider the definition of small o in probability.
Let $(X_n)$ be a sequence of random variables with $X_n=1+o_p(1)$. That is, for any $\delta,\epsilon>0$ we have $P(\lvert X_n-1\rvert\geq \delta)\leq \epsilon$ for all sufficiently large $n$; in other words, the sequence converges in probability to $1$.
This led me to conjecture that there should be a constant $M$ with $0<M<1$ such that $P(\lvert X_n\rvert\geq M)\to1$ as $n\to\infty$, or equivalently, $P(\lvert X_n\rvert < M)\to 0$.
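As a sanity check before attempting a proof, the conjecture can be probed numerically. The sketch below (my own illustration, not part of the question) takes the hypothetical example $X_n = 1 + Z/\sqrt{n}$ with $Z$ standard normal, which satisfies $X_n = 1 + o_p(1)$, and estimates $P(\lvert X_n\rvert\geq M)$ by Monte Carlo for a fixed $M\in(0,1)$:

```python
import numpy as np

rng = np.random.default_rng(0)
M = 0.5  # any fixed constant with 0 < M < 1

# Hypothetical example: X_n = 1 + Z/sqrt(n) with Z ~ N(0,1),
# so X_n converges in probability to 1, i.e. X_n = 1 + o_p(1).
for n in [10, 100, 10_000]:
    X_n = 1 + rng.standard_normal(100_000) / np.sqrt(n)
    # Monte Carlo estimate of P(|X_n| >= M); should approach 1 as n grows
    print(n, np.mean(np.abs(X_n) >= M))
```

The estimated probabilities increase toward $1$ as $n$ grows, consistent with the conjecture, though of course a simulation for one particular sequence proves nothing in general.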
Question. If this is true, how can one prove it?