
Let me define it first:

Let $X$ be an integrable random variable on a probability space $(\Omega, S, P)$, and let $C$ be a sub-$\sigma$-algebra of $S$. Then there exists a $C$-measurable random variable $Y$, unique up to almost sure equality, such that $\int_{E}Y\,dP=\int_{E}X\,dP$ for all $E\in C$. This random variable $Y$ is called the conditional expectation of $X$ given $C$ and is denoted by $E(X\mid C)$.

What I know is that if we have a random variable $X$ on a probability space, then its expectation is given by $E[X]=\int_{\Omega}X\,dP$. Now I want to understand what the above definition of conditional expectation is saying; if someone could give an example related to this definition, that would be helpful. Thanks.

  • Try using an example from the counting measure (discrete random variable). – Commented Sep 10, 2022 at 20:08

2 Answers


See chapter 4 of https://services.math.duke.edu/~rtd/PTE/PTE5_011119.pdf for more detail and examples. My favorite example is the one from undergrad probability: Let $X$ and $Y$ be discrete random variables. Then $$E(X \mid Y = y) = \sum_{x}xP(X = x \mid Y = y) = \frac{1}{P(Y = y)}\sum_{x}P(X = x, Y = y) = \frac{1}{P(Y = y)}E(X 1_{Y = y}).$$ This says that $E(X \mid Y = y)$ is the average of $X$ over the set where $Y = y$. In general, assuming $E(X^2) < \infty$, you can show that $E(X \mid Y)$ is the function $h(Y)$ of $Y$ that minimizes $E((h(Y) - X)^2)$. So in this sense, $E(X \mid Y)$ is the closest function of $Y$ to $X$.
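As a quick numerical sanity check of the "average of $X$ over the set where $Y = y$" interpretation, here is a short simulation sketch (the two-dice setup is only an illustrative assumption, not something fixed by the answer): take two fair dice, let $Y$ be the first die and $X$ the sum, so that $E(X \mid Y = y) = y + 3.5$.

```python
import numpy as np

# Illustrative assumption: two fair dice, Y = first die, X = sum of both,
# so theory gives E(X | Y = y) = y + 3.5.
rng = np.random.default_rng(0)
n = 500_000
d1 = rng.integers(1, 7, size=n)   # uniform on {1, ..., 6}
d2 = rng.integers(1, 7, size=n)
Y, X = d1, d1 + d2

for y in range(1, 7):
    on_event = X[Y == y]          # samples of X on the event {Y = y}
    print(y, round(on_event.mean(), 3), "vs theory:", y + 3.5)
```

The empirical conditional means agree with $y + 3.5$ up to sampling error, which is exactly the "average over the set where $Y = y$" reading of the formula above.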


A very simple example can be given with a standard die. Consider the probability space $(\Omega ,\mathcal{F},P)$ where $\Omega :=\{1,\ldots ,6\}$, $\mathcal{F}:=2^{\Omega }$ and $P(\{\omega \}):=\frac1{6}$ for every $\omega \in \Omega $.

Now suppose that the random variable $X:\Omega \to \mathbb{R}$ represents the die roll, that is, $X(\omega )=\omega $, so $P(X=k)=\frac1{6}$ when $k\in\{1,\ldots ,6\}$ and zero otherwise. A sub-$\sigma $-algebra of $\mathcal{F}$ is $\mathcal{G}:=\{\emptyset ,\{1,3,5\},\{2,4,6\},\Omega \}$, and we have that

$$ \mathrm{E}[X|\mathcal{G}](\omega )=\begin{cases} \frac1{P(\{1,3,5\})}\int_{\{1,3,5\}}X\,d P,&\text{ when }\omega \in\{1,3,5\}\\ \frac1{P(\{2,4,6\})}\int_{\{2,4,6\}}X\,d P,&\text{ when }\omega \in\{2,4,6\} \end{cases} $$

The previous relation follows from the fact that if $A\in \mathcal{G}$ is an atom of the probability space $(\Omega , \mathcal{G}, P)$, then $\mathrm{E}[X|\mathcal{G}]$ can be chosen to be constant on $A$, as there is no $B\in \mathcal{G}$ with $B\subset A$, $P(B)<P(A)$ and $P(B)\neq 0$. Then from the equality

$$ \int_{A}\mathrm{E}[X|\mathcal{G}]\,d P=\int_{A}X\,d P,\quad \text{ for every }A\in \mathcal{G} $$

and the fact that $\mathrm{E}[X|\mathcal{G}]$ is constant on $A$, it follows that $\mathrm{E}[X|\mathcal{G}](\omega )=\frac1{P(A)}\int_{A}X\,d P$ for every $\omega \in A$ (note that "for every $\omega \in A$" can be weakened to "for almost every $\omega \in A$").
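Plugging in the numbers (an easy check with the uniform measure above): $P(\{1,3,5\})=P(\{2,4,6\})=\frac12$, $\int_{\{1,3,5\}}X\,dP=\frac{1+3+5}{6}=\frac32$ and $\int_{\{2,4,6\}}X\,dP=\frac{2+4+6}{6}=2$, so

$$ \mathrm{E}[X|\mathcal{G}](\omega )=\begin{cases} \frac{1+3+5}{3}=3,&\text{ when }\omega \in\{1,3,5\}\\ \frac{2+4+6}{3}=4,&\text{ when }\omega \in\{2,4,6\} \end{cases} $$

that is, the conditional expectation averages $X$ separately over the "odd" and "even" atoms of $\mathcal{G}$.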

  • Can you please explain why we say that the random variable $Y$, measurable with respect to $C$, is the conditional expectation of $X$ given $C$? I mean, why do we take $E(X\mid C)=Y$?
    – Andyale
    Commented Sep 11, 2022 at 7:07
  • There was a mistake in the answer; I have fixed it and added an explanation for the given result.
    – Masacroso
    Commented Sep 11, 2022 at 10:26
