This is a bit of a silly question, but I need a way to reduce $o_\text{P}(f) + o(g)$ to one term. I know that $o(f) + o(g) = o(\max(f, g))$, but I am not sure how to extend the property when one of those terms is $o_\text{P}$ instead, and I can't seem to find a good answer online.

*Edit:* Ahh, forgot to define $o_\text{P}$. I say $X_{n} = o_\text{P}(c_{n})$ for a sequence of constants $c_{n}$ if, for all $\epsilon > 0$, $$ \lim_{n \to \infty}\Pr\left(\left|\frac{X_{n}}{c_{n}}\right| \geq \epsilon \right) = 0. $$ An equivalent, perhaps more familiar, formulation: for all $\delta, \epsilon > 0$, there exists some $n_{0}$ such that, for all $n > n_{0}$, $$\Pr\left(\left|\frac{X_{n}}{c_{n}}\right| > \delta \right) \leq \epsilon. $$ So it's essentially an extension of $o(\cdot)$ to random variables; thus, if $X_{n} = o_\text{P}(c_{n})$ and $Y_{n} = o_\text{P}(d_{n})$, then $X_{n} + Y_{n} = o_\text{P}(\max(c_{n}, d_{n}))$.
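
For reference, here is a sketch of the union-bound argument behind that last claim, writing $m_{n} := \max(|c_{n}|, |d_{n}|)$ (a symbol introduced only for this bound; the absolute values cover possibly negative $c_{n}$ and $d_{n}$, as a comment below notes). If $|X_{n} + Y_{n}| \geq \epsilon\, m_{n}$, then $|X_{n}| \geq \tfrac{\epsilon}{2} m_{n}$ or $|Y_{n}| \geq \tfrac{\epsilon}{2} m_{n}$, and since $m_{n} \geq |c_{n}|$ and $m_{n} \geq |d_{n}|$, $$\Pr\left(\left|\frac{X_{n} + Y_{n}}{m_{n}}\right| \geq \epsilon \right) \leq \Pr\left(\left|\frac{X_{n}}{c_{n}}\right| \geq \frac{\epsilon}{2} \right) + \Pr\left(\left|\frac{Y_{n}}{d_{n}}\right| \geq \frac{\epsilon}{2} \right) \to 0.$$ The same sketch would cover the $o_\text{P}(f) + o(g)$ case in the question, since a deterministic $o(g)$ term is in particular $o_\text{P}(g)$ (the probabilities involved are eventually $0$).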

  • What does $o_P$ mean? – Thomas Andrews (May 21 at 0:12)
  • @Thomas Andrews Realized I forgot to define $o_\text{P}$; it's convergence in probability. Post edited. – JerBear (May 21 at 0:36)
  • Isn't $o_P$ weaker than $o$? So $o(g)$ implies $o_P(g)$, and thus $o_P(f) + o(g) = o_P(\max\{f,g\})$? (May 21 at 1:00)
  • Since your definitions allow $c_n$ to be negative, with $o_P(-c_n) = o_P(c_n)$, I think you want $\max(|f|, |g|)$, not $\max(f,g)$. (May 21 at 3:45)
  • @JerBear I don't think that conclusion is right: when $g=f$ you're asserting that $o_P(f) + o(f) = o(f)$, which I think is easily refuted. Remember that statements like $x=o(y)$ are not reversible in general (despite the use of the equals sign): even if $x=o(y)$, we can't replace any $o(y)$ we see by $x$. (May 21 at 3:55)
