
My question is the same as the one posted at Expected value of decreasing function of random variable versus expected value of random variable, with one extra assumption: the two random variables $X_1$ and $X_2$ have the same variance. Can we conclude in that case that $\mathbb{E}[g(X_1)] < \mathbb{E}[g(X_2)]$? If not, what extra assumption would be needed for this to hold?

  • Why not consider the answers in that thread?
    – whuber
    Commented Jun 11 at 12:48
  • The answer is going to be no, and it will not be difficult to construct counterexamples. Find two distributions with different shapes, then relocate and rescale the second to match the expectation and variance of the first. Find a decreasing function which gives different expectations when applied. Relocate your initial distribution slightly (so not changing its variance or the inequality on the decreasing function) to produce the counterexample. This method will fail if your decreasing function is linear, which is a strong extra assumption. (A numerical sketch of this recipe appears after the comments.)
    – Henry
    Commented Jun 11 at 14:19
  • Will a condition on the convexity of the function help? Perhaps by exploiting Jensen's inequality in that case?
    – irodr
    Commented Jun 12 at 7:10

