
I would like to show the following statement using the general definition of conditional expectation. I believe it is true, as it has also been pointed out in other posts.

Let $X,Y$ be conditionally independent random variables w.r.t. a sigma-algebra $\mathcal{G}$, and suppose the expected values of $X$, $Y$ and $XY$ exist. Then $$\mathbb{E}(XY\mid \mathcal{G})= \mathbb{E}(X\mid \mathcal{G})\mathbb{E}(Y\mid \mathcal{G}).$$

Using the definition of conditional expectation given on Wikipedia, it suffices to show that $(1)$ $\mathbb{E}(X\mid \mathcal{G})\mathbb{E}(Y\mid \mathcal{G})$ is $\mathcal{G}$-measurable and $(2)$ that $\mathbb{E}(\mathbb{E}(X\mid \mathcal{G})\mathbb{E}(Y\mid \mathcal{G})\mathbf{1}_A)=\mathbb{E}(XY\mathbf{1}_A)$ for all $A \in \mathcal{G}$. $(1)$ is clear, as the product of two $\mathcal{G}$-measurable functions is $\mathcal{G}$-measurable. For the second part I can deduce $\mathbb{E}(XY\mathbf{1}_A)=\mathbb{E}(X\mathbf{1}_A)\mathbb{E}(Y\mathbf{1}_A)/\mathbb{P}(A)=\mathbb{E}(\mathbb{E}(X\mid \mathcal{G})\mathbf{1}_A)\mathbb{E}(\mathbb{E}(Y\mid \mathcal{G})\mathbf{1}_A)/\mathbb{P}(A)$ using conditional independence and the definition of conditional expectation, but then I have no idea how to show that the latter equals the LHS in $(2)$. Do you have any ideas how to proceed?
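Written out as displays (just restating the two conditions above), what has to be verified is
$$\text{(1)}\quad \mathbb{E}(X\mid \mathcal{G})\,\mathbb{E}(Y\mid \mathcal{G}) \ \text{is } \mathcal{G}\text{-measurable},$$
$$\text{(2)}\quad \mathbb{E}\bigl(\mathbb{E}(X\mid \mathcal{G})\,\mathbb{E}(Y\mid \mathcal{G})\,\mathbf{1}_A\bigr)=\mathbb{E}(XY\,\mathbf{1}_A) \quad \text{for all } A\in\mathcal{G}.$$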

  • Arguably that should be the definition of conditional independence. – user711689, Jul 10, 2021 at 4:29
  • This should rather be the definition of conditional uncorrelatedness. – Matija, Oct 5, 2022 at 16:37

1 Answer


I think we should start from the definition of conditional independence. Two random variables $X$ and $Y$ are conditionally independent w.r.t. $\mathcal{G}$ if for any $B\in \sigma(X)$ and $D\in \sigma(Y)$ we have $$ P(B\cap D|\mathcal{G}) = P(B|\mathcal{G})P(D|\mathcal{G}). $$ Starting from this definition (given on Wikipedia), we can prove \begin{equation} E[XY|\mathcal{G}] = E[X|\mathcal{G}]E[Y|\mathcal{G}]\tag{1} \end{equation} by following the “standard machinery” of probability and measure theory, which consists of three steps:

1) If both $X$ and $Y$ are simple functions, i.e., $X = \sum_{i=1}^n x_iI_{A_i}$ and $Y = \sum_{j=1}^m y_jI_{B_j}$, where $\{A_i\}_{i=1}^n\subset\sigma(X)$ and $\{B_j\}_{j=1}^m\subset\sigma(Y)$ are finite measurable partitions of $\Omega$, then equation (1) follows directly from the definition of conditional independence and the linearity of conditional expectation (see the sketch after step 3 below).

2) If $X$ and $Y$ are non-negative, then there exist two sequences of simple functions $\{f_n\}$ and $\{g_n\}$ such that $f_n\uparrow X$ and $g_n\uparrow Y$; we may take $f_n$ to be $\sigma(X)$-measurable and $g_n$ to be $\sigma(Y)$-measurable, so step 1) applies to each pair $(f_n,g_n)$. Since $0\le f_ng_n\le XY$ and $XY$ has finite expectation, the dominated convergence theorem for conditional expectation gives $$ E[XY|\mathcal{G}] = \lim_{n\rightarrow \infty}E[f_ng_n|\mathcal{G}] = \lim_{n\rightarrow \infty}E[f_n|\mathcal{G}]E[g_n|\mathcal{G}] = E[X|\mathcal{G}]E[Y|\mathcal{G}]. $$

3) For general $X$ and $Y$, define $X^+(\omega) = \max\{X(\omega),0\}\geq 0$, $X^-(\omega) = \max\{-X(\omega),0\}\geq 0$, and write $X = X^+-X^-$, $Y = Y^+-Y^-$. Applying the result in 2) to $X^+,X^-,Y^+,Y^-$ (if $X$ and $Y$ are conditionally independent, then so are $X^+$ and $Y^+$, and so on, since $\sigma(X^\pm)\subseteq\sigma(X)$ and $\sigma(Y^\pm)\subseteq\sigma(Y)$), we obtain the equality for $X$ and $Y$.
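For concreteness, here is one way the computations behind steps 1) and 3) can be written out (nothing beyond the definition above is used). For step 1), with $A_i\in\sigma(X)$ and $B_j\in\sigma(Y)$,
$$ E[XY|\mathcal{G}] = \sum_{i=1}^n\sum_{j=1}^m x_iy_j\,P(A_i\cap B_j|\mathcal{G}) = \sum_{i=1}^n\sum_{j=1}^m x_iy_j\,P(A_i|\mathcal{G})P(B_j|\mathcal{G}) = E[X|\mathcal{G}]\,E[Y|\mathcal{G}]. $$
For step 3), expanding $XY=(X^+-X^-)(Y^+-Y^-)$ and applying 2) to each of the four non-negative products,
$$ E[XY|\mathcal{G}] = E[X^+|\mathcal{G}]E[Y^+|\mathcal{G}] - E[X^+|\mathcal{G}]E[Y^-|\mathcal{G}] - E[X^-|\mathcal{G}]E[Y^+|\mathcal{G}] + E[X^-|\mathcal{G}]E[Y^-|\mathcal{G}] = E[X|\mathcal{G}]\,E[Y|\mathcal{G}], $$
where the last step factors the sum as $\bigl(E[X^+|\mathcal{G}]-E[X^-|\mathcal{G}]\bigr)\bigl(E[Y^+|\mathcal{G}]-E[Y^-|\mathcal{G}]\bigr)$.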

