
I am confused about the notion of $\sigma$-algebras representing information and what information is contained in $\sigma(X)$ for a random variable $X$.

Suppose $(\Omega, \mathcal{F}, \mathbb{P})$ is a probability triple, and $(Y_\gamma : \gamma \in C)$ is a collection of random variables. I am reading that $\sigma(Y_\gamma : \gamma \in C)$ consists of all the events $F \in \mathcal{F}$ such that for all $\omega \in \Omega$ it is possible to determine whether or not $\omega \in F$ given $(Y_\gamma(\omega) : \gamma \in C)$.

I don't even see how this makes sense if we restrict ourselves to a single random variable $X$. Suppose I know $X(\omega)$. I can only be sure $F$ occurred if $F\supseteq X^{-1}(\{X(\omega)\})$ and I can only be sure that $F$ didn't occur if $X(\omega) \notin X(F)$. But we know that $\sigma(X) = \sigma(\{X^{-1}(B) : B \in \mathcal{B}\})$. Where are all the other sets coming from?


4 Answers


Maybe it will be helpful to look at the case where $X$ has a small finite range, e.g., $X:\Omega\to \{1,2,3,4\}$. In this case, we can partition the points of $\Omega$ into four disjoint subsets, $A:=X^{-1}(1)$, $B:=X^{-1}(2)$, $C:=X^{-1}(3)$, and $D:=X^{-1}(4)$. Given the value of $X$ at some $\omega$, we then know which of $A$, $B$, $C$, or $D$ contains $\omega$, so we can tell whether or not each of the events $A$, $B$, $C$, or $D$ happened, and $\sigma(X)$ must contain at least these four events. But we can also tell whether $B\cup C$ happened, since this event occurs just when $X(\omega)\in\{2,3\}$. So $B\cup C\in \sigma(X)$ as well, and likewise for every other union. Explicitly, $\sigma(X)$ is the Boolean algebra generated by $A$, $B$, $C$, and $D$, which has $2^4=16$ elements: $$\sigma(X)=\{\emptyset,A,B,C,D,A\cup B,A\cup C,A\cup D,B\cup C,B\cup D,C\cup D,A\cup B\cup C,A\cup B\cup D,A\cup C\cup D,B\cup C\cup D,\Omega\}.$$

In the general case, any element of $\sigma(X)$ has the form $X^{-1}(S)=\bigcup_{s\in S} X^{-1}(\{s\})$ for some Borel set $S\subseteq\mathbb{R}$.
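For concreteness, here is a minimal Python sketch of the finite case above. The sample space, its labels, and all variable names are invented for this illustration; it simply enumerates $\sigma(X)$ as the collection of all unions of the four level sets.

```python
from itertools import chain, combinations

# Toy sample space and a random variable X with range {1, 2, 3, 4}.
# (All names here are illustrative and not taken from the answer.)
Omega = {"a", "b", "c", "d", "e", "f", "g", "h"}
X = {"a": 1, "b": 1, "c": 2, "d": 2, "e": 3, "f": 3, "g": 4, "h": 4}

# The atoms of sigma(X): the level sets X^{-1}(1), ..., X^{-1}(4).
atoms = {v: frozenset(w for w in Omega if X[w] == v) for v in set(X.values())}

def powerset(iterable):
    """All subsets of `iterable`, as tuples."""
    s = list(iterable)
    return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

# Every element of sigma(X) is X^{-1}(S), i.e. a union of atoms over s in S.
sigma_X = {frozenset().union(*(atoms[v] for v in S)) if S else frozenset()
           for S in powerset(atoms)}

print(len(sigma_X))  # 16 = 2**4 events, matching the list above
```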

  • Okay, I see how we can tell if anything in $\sigma(X^{-1}(\{s\}) : s \in \mathbb{R})$ happened, but how does this extend to $\sigma(X^{-1}(S): S \in \mathcal{B})$? I'm pretty sure singletons don't generate the Borel $\sigma$-algebra.
    – nullUser
    Commented Jan 23, 2013 at 21:51
  • If you want to know whether the event $E:=X^{-1}([2,3])$ happened, and you know the value of $X(\omega)$, then $E$ happened iff $2\le X(\omega)\le 3$. So, whether or not $E$ happened can be determined from $X(\omega)$. Commented Jan 23, 2013 at 21:56
  • 1
    Okay. But then how come I can't determine whether $X^{-1}(A)$ happened when $A$ is nonmeasurable?
    – nullUser
    Commented Jan 23, 2013 at 22:03
  • You can determine whether or not this event happened (given $X(\omega)$), but you may not be able to find its probability, since $A$ is nonmeasurable. So, it's left out of the $\sigma$-algebra. Commented Jan 23, 2013 at 22:12

A sense in which the $\sigma$-algebra generated by a random variable represents information is given by the Doob-Dynkin Lemma:

Result: Let $(\Omega,\mathcal{F})$ be a measurable space and $f:\Omega\to\mathbb{R}$ be a random variable. Let $g:\Omega\to\mathbb{R}$ be a function. Then $g$ is $\sigma(f)$-measurable if and only if there is a measurable function $h:\mathbb{R}\to\mathbb{R}$ such that $g=h\circ f.$
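For a random variable with a finite range, the factorization can be made concrete. Here is a minimal Python sketch (all names and values are invented for the illustration) that tabulates $h$ from the observed pairs $(f(\omega), g(\omega))$; this works precisely because a $\sigma(f)$-measurable $g$ is constant on each level set of $f$.

```python
# Minimal illustration of the Doob-Dynkin factorization in the finite
# case: g is constant on each level set of f, so a function h with
# g = h o f can be tabulated from observed values.
# (Names Omega, f, g, h are illustrative, not from the answer.)
Omega = ["w1", "w2", "w3", "w4"]
f = {"w1": 0, "w2": 0, "w3": 1, "w4": 1}   # the observed signal
g = {"w1": 5, "w2": 5, "w3": 7, "w4": 7}   # constant on {f = 0} and {f = 1}

# Tabulate h on the range of f; this is well-defined precisely because
# g is constant wherever f is.
h = {f[w]: g[w] for w in Omega}

assert all(g[w] == h[f[w]] for w in Omega)  # g factors as h o f
print(h)  # {0: 5, 1: 7}
```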

Here is a sense in which a $\sigma$-algebra may fail to represent information (the example is from Billingsley): Let $\Omega=[0,1]$, let $\mathcal{F}$ be the Borel sets, and let $\mu$ be Lebesgue measure. Let $\mathcal{C}\subseteq\mathcal{F}$ be the $\sigma$-algebra consisting of the countable sets and the sets with countable complement. No random variable generates $\mathcal{C}$. Since $\mathcal{C}$ contains all singletons, it should in some sense be perfectly informative. But the conditional expectation $\mathbb{E}[\,\cdot\mid\mathcal{C}]$ of any integrable random variable is almost surely equal to a constant, and so is every $\mathcal{C}$-measurable function.

To see this, note that $\mathcal{C}$ contains only events of Lebesgue measure $0$ or $1$. Let $g:\Omega\to\mathbb{R}$ be $\mathcal{C}$-measurable; without loss of generality $g(\Omega)\subseteq [0,1]$ (otherwise compose $g$ with a measurable bijection from $\mathbb{R}$ onto $(0,1)$). One of the sets $g^{-1}[0,1/2]$ and $g^{-1}[1/2,1]$ must have measure $1$; say it is $g^{-1}[1/2,1]$. Then one of the sets $g^{-1}[1/2,3/4]$ and $g^{-1}[3/4,1]$ must have measure $1$. Continuing this way, we get a decreasing sequence of closed intervals, with diameters tending to $0$, that all have measure $1$ under the distribution of $g$. Hence there is a unique point $r$ in their intersection, and $g^{-1}\{r\}$ has measure $1$. So $g$ is almost surely equal to the random variable that is constantly $r$, and $\mathcal{C}$ is completely uninformative.
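In symbols, writing $I_n$ for the closed interval chosen at stage $n$ (notation introduced here for readability, not from the original), the conclusion follows by continuity of measure from above: $$[0,1]=I_0\supseteq I_1\supseteq I_2\supseteq\cdots,\qquad \mu\big(g^{-1}(I_n)\big)=1,\qquad \operatorname{diam}(I_n)=2^{-n},$$ so $\bigcap_n I_n=\{r\}$ for a unique $r\in[0,1]$, and $$\mu\big(g^{-1}(\{r\})\big)=\mu\Big(\bigcap_n g^{-1}(I_n)\Big)=\lim_{n\to\infty}\mu\big(g^{-1}(I_n)\big)=1.$$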

  • I was wondering: how is the lemma used to explain that the $\sigma$-algebra generated by a random variable represents information?
    – Tim
    Commented Jan 31, 2013 at 2:22
  • @Tim Imagine a decision maker who does not know the true state $\omega\in\Omega$ but only observes a signal $f(\omega)$. In each state $\omega$, she takes some decision $d(\omega)$. Since she does not know the true state, her decision can depend only on the signal, so it should be of the form $d=g\circ f$ for some $g$. The Doob-Dynkin lemma establishes that, in the measurable framework under the given assumptions, this is exactly the same as $d$ being $\sigma(f)$-measurable. Commented Jan 31, 2013 at 7:48
  • And does the decision $d$ depend on getting the right information, i.e., on exactly which $\omega$ occurred?
    – user415535
    Commented Sep 6, 2017 at 5:18
  • @user21312 No, you only observe the value $f(\omega)$. Commented Sep 6, 2017 at 5:46
  • @user21312 No, that is pretty much it. Suppose some normally distributed number is drawn and you only see its sign. Making your decision depend only on the sign amounts to choosing according to the actual value while being required to make the same choice for any two numbers with the same sign. Commented Sep 6, 2017 at 6:51

It seems reasonable to define "the information represented by $X$" as the following collection of events: $$ \mathcal{I} := \{A \in \mathcal{F}: \mathbf{1}_B(X(\omega)) =\mathbf{1}_A(\omega) \text{ for all $\omega\in\Omega$, for some $B \in \mathcal{B}$}\} $$ In other words, an event is in $\mathcal{I}$ if and only if we can identify its occurrence by observing whether or not $X$ falls into a particular set. It is straightforward to show that $\sigma(X) = \mathcal{I}$ using the Doob-Dynkin Lemma.
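As a sanity check, here is a hypothetical brute-force verification of $\sigma(X)=\mathcal{I}$ on a small finite space, where every subset is measurable; all names in the code are invented for the illustration. It compares the events whose indicator factors through $X$ against the unions of the level sets of $X$.

```python
from itertools import chain, combinations

# Finite check that I = sigma(X). On a finite Omega with F = 2^Omega,
# brute-force the events whose indicator factors through X and compare
# with the unions of level sets of X. (All names are illustrative.)
Omega = [0, 1, 2, 3, 4, 5]
X = {w: w % 3 for w in Omega}          # X has range {0, 1, 2}
range_X = set(X.values())

def powerset(s):
    """All subsets of `s`, as tuples."""
    s = list(s)
    return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

def factors_through_X(A):
    # A is in I iff 1_A(w) == 1_B(X(w)) for all w, for some B.
    return any(all((w in A) == (X[w] in B) for w in Omega)
               for B in map(set, powerset(range_X)))

I = {frozenset(A) for A in powerset(Omega) if factors_through_X(A)}

# sigma(X): all unions of the level sets X^{-1}(v).
level = {v: frozenset(w for w in Omega if X[w] == v) for v in range_X}
sigma_X = {frozenset().union(*(level[v] for v in S)) if S else frozenset()
           for S in powerset(range_X)}

assert I == sigma_X
print(len(I))  # 8 = 2**3 events
```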

  • So the information is the occurrence of certain events?
    – user415535
    Commented Sep 6, 2017 at 5:14
  • The information is all the events we can possibly learn about from observing $X$.
    – David
    Commented Sep 6, 2017 at 13:06
  • OK, and learning about an event means learning whether or not it occurred? I posted a question math.stackexchange.com/questions/2418679/… a few hours ago after reading this. Check it out if you like.
    – user415535
    Commented Sep 6, 2017 at 13:14
  • I think the last step $\sigma(X)=\mathcal{I}$ can be proved directly from the definition of the $\sigma$-algebra generated by a random variable, without using any lemma.
    – PPP
    Commented Mar 26 at 14:13

This thread was pointed out to me in response to a thread I opened: Sigma-algebras and relationships? I believe the two threads may be related, so I am posting the link here. Thanks.

