
Let $X$ and $Y$ be two random variables (say real numbers, or vectors in some vector space). It seems to me that the following is true:

$$E\big[\,X \mid E[X \mid Y]\,\big] = E[X \mid Y]$$

Note that $E[X \mid Y]$ is a random variable in its own right. Also note that equality here is point-wise, for every point in the sample space of the joint distribution of $(X,Y)$. My question, assuming I'm not missing something and the above is true, is whether this law has a name, or is written down / proved somewhere.
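The claimed identity can be sanity-checked numerically. Below is a minimal Monte Carlo sketch on a hypothetical toy model (my own choice, not from the question): $Y$ uniform on $\{0,1,2\}$ and $X = Y + $ symmetric mean-zero noise, so that $Z = E[X \mid Y] = Y$ exactly. Grouping samples of $X$ by the value of $Z$ then estimates $E[X \mid Z]$, which should come out close to $Z$ itself.

```python
# Hypothetical toy model: Y uniform on {0,1,2}, X = Y + mean-zero noise,
# so Z = E[X | Y] = Y exactly. We estimate E[X | Z] empirically.
import random
from collections import defaultdict

random.seed(0)
groups = defaultdict(list)              # samples of X, grouped by the value of Z
for _ in range(200_000):
    y = random.choice([0, 1, 2])
    x = y + random.choice([-1, 0, 1])   # symmetric noise, so E[X | Y] = Y
    z = y                               # Z = E[X | Y] = Y in this model
    groups[z].append(x)

for z in sorted(groups):
    xs = groups[z]
    print(z, round(sum(xs) / len(xs), 2))   # empirical E[X | Z=z], close to z
```

Each printed conditional mean lands near the corresponding value of $Z$, consistent with $E[X \mid Z] = Z$.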

  • This is an extremely important result. I failed a course in probability because I couldn't apply it correctly. I think this is called Bayes' rule for expectation or something; in any case, it does have a name.
    – Fraïssé
    Commented Jan 26, 2015 at 7:02

2 Answers


Let $Z=E[X\ | \ Y]$. Your equation states: $E[X \ | \ Z]=Z$. This follows from the following fact.

Tower Property of Conditional Expectation:

$$E[E[X\ | \ \mathcal{F}]\ | \ \mathcal{G}]=E[X\ | \ \mathcal{G}],\text{ whenever }\mathcal{G}\subset \mathcal{F}.$$

Proof of your equation:

We apply the tower property with $\mathcal{G}=\sigma(Z)$ and $\mathcal{F}=\sigma(Y)$. Note that $\sigma(Z)\subset \sigma(Y)$ because $Z$, being a conditional expectation given $Y$, is by construction $\sigma(Y)$-measurable.

Plugging in to the tower property, $$ \begin{align*} E[E[X\ | \ \sigma(Y)]\ | \ \sigma(Z)]&=E[X\ | \ \sigma(Z)]\\ \implies E[Z\ | \ Z]&=E[X\ | \ Z]\\ \implies Z&=E[X\ | \ Z]. \end{align*}$$
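The tower property itself can also be checked numerically. Here is a sketch on a hypothetical toy model (my own construction): $Y_1, Y_2$ independent fair coin flips, $\mathcal{F}=\sigma(Y_1,Y_2)$, $\mathcal{G}=\sigma(Y_1)$, and $X = Y_1 + 2Y_2 + $ mean-zero noise, so that $E[X\mid\mathcal{F}] = Y_1 + 2Y_2$ and $E[X\mid\mathcal{G}] = Y_1 + 2\,E[Y_2] = Y_1 + 1$.

```python
# Hypothetical toy model: Y1, Y2 independent fair coin flips,
# F = sigma(Y1, Y2), G = sigma(Y1), X = Y1 + 2*Y2 + mean-zero noise,
# so E[X | F] = Y1 + 2*Y2 and E[X | G] = Y1 + 2*E[Y2] = Y1 + 1.
import random
from collections import defaultdict

random.seed(1)
inner_by_y1 = defaultdict(list)           # samples of E[X | F], grouped by Y1
for _ in range(200_000):
    y1 = random.randint(0, 1)
    y2 = random.randint(0, 1)
    inner_by_y1[y1].append(y1 + 2 * y2)   # the inner conditional expectation

for y1 in (0, 1):
    vals = inner_by_y1[y1]
    lhs = sum(vals) / len(vals)   # estimate of E[ E[X|F] | G ] on {Y1 = y1}
    rhs = y1 + 1                  # E[X | G] on the same event
    print(y1, round(lhs, 2), rhs)
```

The averaged inner expectation agrees with $E[X\mid\mathcal{G}]$ on each event $\{Y_1 = y_1\}$, as the tower property predicts.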

  • What is $\sigma$ here?
    – sd234
    Commented Jan 26, 2015 at 9:50
  • It's a technical formality that means "the $\sigma$-algebra generated by a random variable". In general, conditional expectation is defined w.r.t. a $\sigma$-algebra, like $\mathcal{F}$ or $\mathcal{G}$. See the formal definition on the wikipedia page en.wikipedia.org/wiki/Conditional_expectation
    – pre-kidney
    Commented Jan 26, 2015 at 9:53
  • Ah, yes of course. That's an elegant proof, thanks!
    – sd234
    Commented Jan 26, 2015 at 10:15

Essentially the law of iterated expectation, perhaps more commonly written as $$\operatorname{E}_X [X] = \operatorname{E}_Y \big[ \operatorname{E}_{X \mid Y} [ X \mid Y] \big].$$

For a discrete case, the essence of the proof is $$\operatorname{E}_Y \big[ \operatorname{E}_{X \mid Y} [ X \mid Y] \big] = \sum_y \sum_x x \cdot \operatorname{P}(X=x \mid Y=y) \cdot \operatorname{P}(Y=y) = \sum_x x \cdot \operatorname{P}(X=x) = \operatorname{E}_X [X].$$
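This discrete computation can be carried out exactly on a small example. The sketch below uses a made-up $2\times 2$ joint pmf (a hypothetical choice, not from the answer) and exact rational arithmetic to confirm $\operatorname{E}_Y[\operatorname{E}_{X\mid Y}[X\mid Y]] = \operatorname{E}_X[X]$.

```python
# Made-up joint pmf on {0,1} x {0,1}; exact arithmetic via Fraction.
from fractions import Fraction as F

pmf = {(0, 0): F(1, 10), (0, 1): F(3, 10),
       (1, 0): F(2, 10), (1, 1): F(4, 10)}

# Marginal P(Y=y) and conditional expectation E[X | Y=y]
p_y = {y: sum(p for (x, yy), p in pmf.items() if yy == y) for y in (0, 1)}
e_x_given_y = {y: sum(x * p for (x, yy), p in pmf.items() if yy == y) / p_y[y]
               for y in (0, 1)}

# Outer expectation over Y of the inner conditional expectation
lhs = sum(e_x_given_y[y] * p_y[y] for y in (0, 1))
ex = sum(x * p for (x, y), p in pmf.items())   # plain E[X]
print(lhs == ex, lhs)   # prints: True 3/5
```

The two sides agree exactly, matching the sum-swapping step in the displayed proof.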

  • Can you actually flesh out why the OP's equation is equivalent to the equation you have written down?
    – pre-kidney
    Commented Jan 26, 2015 at 7:15
  • Yea, I'm not sure I see the equivalence either...
    – sd234
    Commented Jan 26, 2015 at 7:22
  • What's with the subscripts? Commented Jan 26, 2015 at 7:45
  • I have read the question's $E [ X \mid E [ X \mid Y ] ]$ as $E_Y [ X \mid E [ X \mid Y ] ]$. If you read it as $E_{X\mid Y} [ X \mid E [ X \mid Y ] ]$ then it is trivial.
    – Henry
    Commented Jan 26, 2015 at 8:21
  • But what's the significance of the subscript, i.e. why not just write $E[Z(Y)]$ as is commonly done? Commented Jan 26, 2015 at 8:28