
The question is-

Let $X_1,X_2,..,X_n$ be iid random variables from a continuous distribution whose density is symmetric about $0$. Suppose $\mathbb{E}(|X_1|)=2$ and define $Y=\sum_{i=1}^{n}X_i$ and $Z=\sum_{i=1}^{n}I(X_i>0)$. Then calculate covariance between $Y$ and $Z$.

My attempt:

$E(X_i)=0$ for all $i=1,\dots,n$, because the distribution of each $X_i$ is symmetric about $0$ and $E(|X_1|)$ exists.

Now,

$Cov(Y,Z)=E(YZ)-E(Y)E(Z)=E(YZ)-0=E\left[\left(\sum_{i=1}^{n}X_i\right)\left(\sum_{j=1}^{n}I(X_j>0)\right)\right]$

$=\sum_{i=1}^{n}E[X_i\,I(X_i>0)] +\sum_{i \neq j}E(X_i)\,E(I(X_j>0))$, as $X_i$ and $X_j$ are independent for $i \neq j$.

$=\sum_{i=1}^{n}E[X_i\,I(X_i>0)] + 0$, as $E(X_i)=0$.

$=\sum_{i=1}^{n}\left\{E[X_i\,I(X_i>0)\mid I(X_i>0)=1]\times\tfrac{1}{2} + E[X_i\,I(X_i>0)\mid I(X_i>0)=0]\times\tfrac{1}{2}\right\}$

$=\sum_{i=1}^{n}E[X_i\,I(X_i>0)\mid I(X_i>0)=1]\times\tfrac{1}{2} + 0$

$=\sum_{i=1}^{n}E[X_i\mid X_i>0]\times\tfrac{1}{2}$

$=2n\times\tfrac{1}{2}=n$
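As a sanity check on the final value (not part of the argument itself), one can simulate a symmetric continuous distribution with $\mathbb{E}(|X_1|)=2$; a Laplace distribution with scale $2$ is one convenient choice, and the derivation predicts $Cov(Y,Z)=n$:

```python
import numpy as np

# Monte Carlo sanity check (illustrative choice of distribution).
# X_i ~ Laplace(0, 2) is symmetric about 0 with E|X| = 2,
# so the derivation above predicts Cov(Y, Z) = n.
rng = np.random.default_rng(0)
n, trials = 5, 200_000
X = rng.laplace(loc=0.0, scale=2.0, size=(trials, n))
Y = X.sum(axis=1)            # Y = sum of the X_i
Z = (X > 0).sum(axis=1)      # Z = number of positive X_i
cov = np.mean(Y * Z) - Y.mean() * Z.mean()
print(cov)  # should be close to n = 5
```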

Is my reasoning correct? Thanks in advance!

  • Note: you use independence one line before you state it, when you already "separate" the expectations in the second sum.
    – Clement C.
    Commented Jul 20, 2019 at 3:26
  • @ClementC. Yeah, gonna edit it. But is the method correct? Commented Jul 20, 2019 at 3:27
  • I am not convinced the last lines are correct, no: the way you handle the conditional expectation to relate it to the expectation of the absolute value.
    – Clement C.
    Commented Jul 20, 2019 at 3:28
  • @ClementC. I used the formula $E(X)=E(X|A)P(A)+E(X|A^c)P(A^c)$. Commented Jul 20, 2019 at 3:30
  • Yes, but how do you derive $\mathbb{E}[ X_i I(X_i > 0) \mid X_i > 0 ] = \mathbb{E}[|X|]$? That's the part missing.
    – Clement C.
    Commented Jul 20, 2019 at 3:33

2 Answers


You have a missing part at the end: you didn't show how you get

$\mathbb{E}[ X_i I(X_i > 0) \mid X_i > 0 ] = \mathbb{E}[|X|],$

which is what you rely on at the very end.


Below is an argument avoiding conditional expectations altogether.

Assume $X$ is continuous (in particular, no mass at $0$) and symmetric around $0$. You have
$$ \mathbb{E}[X \cdot I(X>0)] = \mathbb{E}[|X| \cdot I(X>0)] = \mathbb{E}[|X| - |X|\cdot I(X<0)] = \mathbb{E}[|X|] - \mathbb{E}[|X|\cdot I(X<0)] \tag{1} $$
but, by symmetry of $X$ around $0$, $X$ and $-X$ have the same distribution, and so
$$\mathbb{E}[|X|\cdot I(X<0)] = \mathbb{E}[|-X|\cdot I(-X<0)] = \mathbb{E}[|X|\cdot I(X>0)]\tag{2}$$
so that, from (1) and (2),
$$ \mathbb{E}[X \cdot I(X>0)] = \frac{1}{2}\mathbb{E}[|X|] $$
allowing you to conclude from what you wrote at the beginning (the first three equations).
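To see the identity $\mathbb{E}[X \cdot I(X>0)] = \frac{1}{2}\mathbb{E}[|X|]$ numerically, here is a quick sketch using a standard Gaussian (my choice of distribution for illustration, not one from the answer), for which $\mathbb{E}|X| = \sqrt{2/\pi}$:

```python
import numpy as np

# Numerical check of E[X·I(X>0)] = E|X|/2 for X ~ N(0,1).
# For the standard Gaussian, E|X| = sqrt(2/pi), so both sides ≈ 0.3989.
rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)
lhs = np.mean(x * (x > 0))        # estimate of E[X·I(X>0)]
rhs = 0.5 * np.mean(np.abs(x))    # estimate of E|X|/2
print(lhs, rhs)  # the two estimates should agree closely
```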

  • So the actual answer would be $n/4$? Commented Jul 20, 2019 at 3:46
  • I don't get how the equality happens in (1). Commented Jul 20, 2019 at 4:00
  • @user587126 Yes, there are several equalities in there. The first should be clear: $X\cdot I(X>0)=|X|\cdot I(X>0)$. The second is from $|X| =|X|\cdot (I(X>0)+I(X=0)+I(X<0))=|X|\cdot (I(X>0)+I(X<0))$ (as $X$ is continuous, $I(X=0)=0$ a.s.). The last one follows from the previous, using linearity of expectation.
    – Clement C.
    Commented Jul 20, 2019 at 6:03
  • I would suggest you write your whole derivation, step by step, somewhere, and check it step by step, instead of scattering it across comments. You claim your derivation yields $$ \mathbb{E}[ X I(X>0) ] =\mathbb{E}[ |X| ] $$ However, this conclusion is wrong (if you cannot see why, check with an example, e.g., a standard Gaussian). Hence, a step is wrong. @user587126
    – Clement C.
    Commented Jul 20, 2019 at 9:53
  • @user587126 Sorry, I cannot help further with checking where exactly your mistake was (I am not in front of my laptop right now). In general, I'd suggest avoiding conditional expectations whenever you can; using them is much trickier than other techniques.
    – Clement C.
    Commented Jul 20, 2019 at 9:57

You have correctly shown that

\begin{align} \mathbb{Cov}\left(\sum_{i=1}^n X_i,\sum_{j=1}^n I(X_j>0)\right)&=\sum_{i=1}^n \sum_{j=1}^n\mathbb{Cov}(X_i,I(X_j>0)) \\&=\sum_{i=1}^n \mathbb{Cov}(X_i,I(X_i>0))+\sum_{i\ne j}^n \underbrace{\mathbb{Cov}(X_i,I(X_j>0))}_{0} \\&=\sum_{i=1}^n \mathbb{E}(X_1I(X_1>0)) \end{align}

Now just use these equations which follow from the law of total expectation: $$\mathbb{E}(X_1)=\mathbb{E}(X_1 I(X_1>0))+\mathbb{E}(X_1 I(X_1<0))$$

and $$\mathbb{E}(|X_1|)=\mathbb{E}(X_1 I(X_1>0))+\mathbb{E}(-X_1 I(X_1<0))$$

The above can, of course, be written using conditional expectations, but there is no need for that.
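For the record, combining the two displayed identities (adding them, then using $\mathbb{E}(X_1)=0$ and $\mathbb{E}(|X_1|)=2$ from the question) gives the needed quantity directly:

$$\mathbb{E}(X_1)+\mathbb{E}(|X_1|)=2\,\mathbb{E}(X_1 I(X_1>0)) \implies \mathbb{E}(X_1 I(X_1>0))=\frac{0+2}{2}=1,$$

so $\mathbb{Cov}(Y,Z)=n\cdot 1=n$.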

  • Thanks. But what is wrong in my method? If I calculate $E(X\cdot I(X>0)\mid X>0)=E(X\cdot 1\mid X>0)$ like this: the pdf of $X\mid X>0$ is $f_X(x)/P(X>0)= 2f_X(x)$, right? So the required expectation is nothing but $2\int_{0}^{\infty} x f_X(x)\,dx=2\cdot 1=2$, which does not match your answer of $2/2=1$. I don't know what I am missing here. Please help. Commented Jul 20, 2019 at 9:52
  • I don't see where it leads if you condition on these events. This is completely unnecessary. Keep in mind that $E(X_1 \mid X_1>0)=\frac{1}{P(X_1>0)}E(X_1 I(X_1>0))$. Commented Jul 20, 2019 at 10:02
