Let $X_1$ and $X_2$ be independent random variables with the same distribution as $X.$ Because $g$ is weakly increasing if and only if $(g(x_2)-g(x_1))(x_2-x_1)\ge 0$ for all real numbers $x_1$ and $x_2,$

$$\operatorname{Cov}(X,g(X)) = \frac{1}{2} E[(X_2-X_1)(g(X_2)-g(X_1))] \ge \frac{1}{2}E[0] = 0,$$

QED. (See *How would you explain covariance to someone who understands only the mean?* for an explanation of this formula for covariance.)
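
In case that link is unavailable, the identity is quick to verify directly: expand the product and use the independence and common distribution of $X_1$ and $X_2$ to obtain

$$\frac{1}{2} E[(X_2-X_1)(g(X_2)-g(X_1))] = \frac{1}{2}\Big(E[X_2 g(X_2)] + E[X_1 g(X_1)] - E[X_1]E[g(X_2)] - E[X_2]E[g(X_1)]\Big) = E[Xg(X)] - E[X]\,E[g(X)] = \operatorname{Cov}(X,g(X)).$$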


An alternative is to exploit the basic invariance of covariance under changes of location. Observe that $g$ is weakly increasing if and only if $x\to g(x)+a$ is weakly increasing for any constant $a.$ Thus, after first shifting $X$ if necessary to make $E[X]=0$ (which changes neither the covariance nor the monotonicity of $g$), choose $a = -g(0)$ so that, without any loss of generality, $g(0)=0$ as well. Because $g$ is weakly increasing, $g(x)\ge 0$ for $x\ge 0$ and $g(x)\le 0$ for $x\le 0,$ whence $X g(X)\ge 0.$ Consequently,

$$\operatorname{Cov}(X,g(X)) = E[X g(X)]\ge E[0] = 0,$$

QED.


Illustrating both approaches is this visual proof without words.

[Figure: visual proof without words]
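
As a numerical complement (not part of either proof), here is a minimal Python sketch that estimates $\operatorname{Cov}(X,g(X))$ by simulation; the standard Normal distribution for $X$ and the particular increasing functions $g$ are illustrative choices only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choices: any distribution for X and any weakly increasing g
# will do; nothing in the argument above depends on these particulars.
x = rng.standard_normal(1_000_000)

increasing_g = {
    "x^3": lambda t: t**3,
    "floor(x)": np.floor,       # weakly (not strictly) increasing
    "arctan(x)": np.arctan,
}

for name, g in increasing_g.items():
    # np.cov returns the 2x2 sample covariance matrix of (X, g(X));
    # its off-diagonal entry estimates Cov(X, g(X)).
    estimate = np.cov(x, g(x))[0, 1]
    print(f"g(x) = {name:>9}: estimated Cov(X, g(X)) = {estimate:.4f}")
```

Each estimate should come out nonnegative (up to sampling error), consistent with the result.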
