All Questions

17 votes
2 answers
5k views

Conditional expectation equals the random variable almost surely

Let $X$ be in $\mathfrak{L}^1(\Omega,\mathfrak{F},P)$ and let $\mathfrak{G}\subset \mathfrak{F}$ be a sub-$\sigma$-algebra. Prove that if $X$ and $E(X|\mathfrak{G})$ have the same distribution, then they are equal almost surely. I ...
Marc • 2,094
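A standard route (a sketch, not necessarily the posted answer): write $Y=E(X|\mathfrak{G})$. For any $c$, $\{Y\le c\}\in\mathfrak{G}$, so $E[(X-Y)\mathbf{1}_{\{Y\le c\}}]=0$ and hence
$$E\big[(X-c)\mathbf{1}_{\{Y\le c\}}\big]=E\big[(Y-c)\mathbf{1}_{\{Y\le c\}}\big]=-E\big[(Y-c)^-\big]=-E\big[(X-c)^-\big],$$
the last step using equality of distributions. Pointwise, $(X-c)\mathbf{1}_{\{Y\le c\}}\ge -(X-c)^-$, so equality of the expectations forces equality a.s.; running $c$ over the rationals then yields $X=Y$ a.s.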
14 votes
1 answer
12k views

Independence and conditional expectation

So, it's pretty clear that for independent $X,Y\in L_1(P)$ (with $E(X|Y)=E(X|\sigma(Y))$), we have $E(X|Y)=E(X)$. It is also quite easy to construct an example (for instance, $X=Y=1$) which shows that ...
user73048 • 299
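A worked one-liner for the positive direction (a sketch under the stated independence): for any $A\in\sigma(Y)$, independence gives
$$E[X\mathbf{1}_A]=E[X]\,E[\mathbf{1}_A]=E\big[E[X]\,\mathbf{1}_A\big],$$
and since the constant $E[X]$ is trivially $\sigma(Y)$-measurable, it satisfies the defining property of $E(X|Y)$.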
11 votes
1 answer
1k views

Do almost sure convergence and $L^1$-convergence imply almost sure convergence of the conditional expectation?

Question. Let $ X_{n}, X $ be random variables on some probability space $ ( \Omega, \mathcal{F},\mathbb{P} ) $ and let $ \mathcal{G} \subset \mathcal{F} $ be a sub-$\sigma$-algebra. Moreover ...
Pass Stoneke
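One half is routine (a sketch): conditional expectation is an $L^1$-contraction, since by conditional Jensen and the tower property
$$E\big|E[X_n\mid\mathcal{G}]-E[X\mid\mathcal{G}]\big|\le E\big[E[|X_n-X|\mid\mathcal{G}]\big]=E|X_n-X|\to 0,$$
so $E[X_n\mid\mathcal{G}]\to E[X\mid\mathcal{G}]$ in $L^1$, hence a.s. along a subsequence; whether the full sequence converges a.s. is the delicate part of the question.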
10 votes
1 answer
392 views

What am I writing when I write $\mathbf X \mid \mathbf Y$?

Suppose $\mathbf X$ is a random variable and $A$ is an event in the same probability space $(\Omega, \mathcal F, \Pr)$. (Formally, $\mathbf X$ is a function on $\Omega$, say $\Omega \to \mathbb R$; $A$...
Misha Lavrov
7 votes
4 answers
905 views

Why is $E[X|X+Y] = E[Y|X+Y]$ if $X,Y$ are i.i.d. random variables

In the proof of the fact that $E[X|X+Y] = \frac{X+Y}{2}$ when $X,Y$ are independent, identically distributed random variables, one uses the observation that $E[X|X+Y] = E[Y|X+Y]$, but I don't see why this ...
hugo • 117
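The symmetry step, spelled out (a sketch): since $X,Y$ are i.i.d., the pairs $(X,X+Y)$ and $(Y,X+Y)$ have the same joint law, which forces $E[X\mid X+Y]=E[Y\mid X+Y]$ a.s.; adding the two and using linearity then gives
$$E[X\mid X+Y]+E[Y\mid X+Y]=E[X+Y\mid X+Y]=X+Y,\qquad\text{so}\quad E[X\mid X+Y]=\frac{X+Y}{2}.$$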
5 votes
1 answer
388 views

Applying a formula like $\mathbb{E}[ X \vert \sigma(\mathcal{F},\mathcal{G}) ] = \mathbb{E}[X \vert \mathcal{G}]$

Let $(Y_1, \ldots, Y_n)$ be a $[0,1]^n$-valued random vector and $U_1, \ldots, U_n$ independent random variables, uniformly distributed on $[0,1]$, and independent of $(Y_1, \ldots, Y_n)$. For some ...
user98187609
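The lemma usually invoked here, stated with the independence hypothesis that makes it true (a sketch, not a verbatim quote from any text): if $\sigma(\sigma(X)\cup\mathcal{G})$ is independent of $\mathcal{F}$, then
$$\mathbb{E}\big[X \mid \sigma(\mathcal{F},\mathcal{G})\big]=\mathbb{E}[X \mid \mathcal{G}].$$
Checked on rectangles: for $A\in\mathcal{F}$, $B\in\mathcal{G}$, $E[X\mathbf{1}_A\mathbf{1}_B]=P(A)\,E[X\mathbf{1}_B]=P(A)\,E\big[E[X\mid\mathcal{G}]\mathbf{1}_B\big]=E\big[E[X\mid\mathcal{G}]\mathbf{1}_A\mathbf{1}_B\big]$, and a $\pi$-system argument extends this to all of $\sigma(\mathcal{F},\mathcal{G})$.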
5 votes
1 answer
132 views

Sigma-algebra generated by conditional expectation

I am dealing with the following question: given two dependent random variables $X_1,X_2$, I am wondering whether the following equivalence for the generated sigma-algebras holds: $$\sigma(X_1)=\sigma(...
G.Rossi • 95
5 votes
1 answer
2k views

Conditional expectation is constant on each part of a partition

I have a question about conditional expectation; while looking for the answer here on Stack Exchange I noticed that there are a few different definitions in use, so I will first give the definitions I ...
Anand • 51
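For reference, the formula all the usual definitions agree on in the partition case (a sketch): if $\mathcal{G}=\sigma(A_1,\dots,A_n)$ for a partition $\{A_i\}$ with $P(A_i)>0$, then
$$E[X\mid\mathcal{G}]=\sum_{i=1}^n\frac{E[X\mathbf{1}_{A_i}]}{P(A_i)}\,\mathbf{1}_{A_i},$$
which is constant on each $A_i$ and integrates to $E[X\mathbf{1}_{A_i}]$ over $A_i$, as the defining property requires.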
5 votes
1 answer
1k views

Bounding tail conditional expectation of a random variable given variance

Given a random variable $X$ with CDF $F$, mean $E(X)=0$, and variance $Var(X)=\sigma^2$, I would like to bound the tail conditional expectation where $X$ is in the tail with probability $1-p$: $E(...
dhz • 621
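A bound of the kind asked for follows from Cauchy–Schwarz applied to the centered indicator (a sketch under the stated assumptions $E(X)=0$, $Var(X)=\sigma^2$): if $A$ is the tail event with $P(A)=1-p$, then
$$E[X\mathbf{1}_A]=E\big[X\big(\mathbf{1}_A-(1-p)\big)\big]\le\sigma\sqrt{p(1-p)},\qquad\text{so}\quad E[X\mid A]\le\sigma\sqrt{\frac{p}{1-p}},$$
with equality for a suitable two-point distribution, so the bound is sharp.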
5 votes
1 answer
973 views

Compute the conditional expectation $E(Y|X)$ for a measurable function $Y$ and a random variable $X$ taking values on $[0,1)$

Good day. Currently I am working with "Probability: Theory and Examples" by Durrett, and while getting familiar with conditional expectations I got to this problem: Consider the Lebesgue ...
Cahn • 4,483
5 votes
0 answers
271 views

Sigma algebra generated by a homeomorphic random variable

Let $\Omega = [0,1]$ be our probability space with the sigma-algebra of Borel sets on $[0,1]$ and Lebesgue measure on $[0,1]$. Let $Y$ be a random variable such that $Y(\omega) = Y(1-\omega)$ for every $\...
Hagrid • 2,601
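One observation that seems safe from the excerpt (hedged, since the question is cut off): any such $Y$ is measurable with respect to the symmetric $\sigma$-algebra $\mathcal{S}=\{B \text{ Borel}: B=1-B\}$, because each $Y^{-1}(C)$ is invariant under $\omega\mapsto 1-\omega$; hence $\sigma(Y)\subseteq\mathcal{S}$. Since $\omega\mapsto 1-\omega$ preserves Lebesgue measure and fixes every $A\in\mathcal{S}$, a natural candidate conditional expectation on $\mathcal{S}$ is
$$E[X\mid\mathcal{S}](\omega)=\frac{X(\omega)+X(1-\omega)}{2}.$$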
4 votes
1 answer
894 views

Why is $E[X|\mathcal{G}]=X$ if $X$ is $\mathcal{G}$-measurable?

If $X$ is a $\mathcal{G}$-measurable random variable, why is $E[X|\mathcal{G}] = X$? I know the intuition (basically we're conditioning on the same information on which $X$ is defined, $\sigma(X)$; we can'...
mick94 • 41
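The formal check is a one-liner against the two defining conditions (a sketch): $X$ itself is $\mathcal{G}$-measurable by hypothesis, and for every $A\in\mathcal{G}$, trivially
$$\int_A X\,dP=\int_A X\,dP,$$
so $X$ satisfies both requirements characterizing $E[X|\mathcal{G}]$, and a.s. uniqueness of conditional expectation gives $E[X|\mathcal{G}]=X$ a.s.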
4 votes
2 answers
130 views

Calculating a conditional expectation

My question is the following. Given that we have $n$ i.i.d. random variables $X_1,\dots,X_n$ with density $f(x)=\frac{2}{\lambda^2}x\mathbf{1}_{[0,\lambda]}(x)$, where $\lambda> 0$ is some ...
Maximilian
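The target of the conditional expectation is cut off in the excerpt, but the basic facts about this density follow by direct integration (a sketch):
$$\int_0^\lambda\frac{2}{\lambda^2}x\,dx=1,\qquad E[X_1]=\int_0^\lambda x\cdot\frac{2}{\lambda^2}x\,dx=\frac{2\lambda}{3}.$$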
4 votes
1 answer
320 views

Doob's Optional Stopping Theorem to find probabilities of stopping times

Suppose we have a simple random walk starting from $S_0=0$, and $S_n=X_1+\dots+X_n$ such that $$\mathbb{P}(X_i=1)=p \hspace{1em}\mathbb{P}(X_i=0)=r \hspace{1em} \mathbb{P}(X_i=-1)=q$$ for positive $p,...
Milly Moo • 115
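The standard device for such walks (a sketch; the boundaries $-a$ and $b$ below are illustrative, since the excerpt is cut off): with $\rho=q/p$ and $p\ne q$, the process $M_n=\rho^{S_n}$ is a martingale, because $\mathbb{E}[\rho^{X_i}]=p\rho+r+q\rho^{-1}=q+r+p=1$. If $T$ is the first hitting time of $\{-a,b\}$, optional stopping gives $\mathbb{E}[\rho^{S_T}]=1$, and solving for the hitting probability yields
$$\mathbb{P}(S_T=b)=\frac{1-\rho^{-a}}{\rho^{\,b}-\rho^{-a}}.$$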
4 votes
1 answer
1k views

Proving $V(X) = E(V(X|Y)) + V(E(X|Y))$ using the Pythagorean theorem

I know the textbook proof of $$V(X) = E(V(X|Y)) + V(E(X|Y))$$ but I'm interested in understanding the weird proof/analogy with the Pythagorean theorem my professor gave in class. With $X, Y$ random ...
Winter • 946
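The geometric reading (a sketch of the argument the analogy rests on, assuming $X\in L^2$): $X-E[X|Y]$ is orthogonal in $L^2$ to every square-integrable $\sigma(Y)$-measurable variable, in particular to $E[X|Y]-E[X]$, so the Pythagorean identity applies to the decomposition $X-E[X]=\big(X-E[X|Y]\big)+\big(E[X|Y]-E[X]\big)$:
$$V(X)=E\big[(X-E[X|Y])^2\big]+E\big[(E[X|Y]-E[X])^2\big]=E\big(V(X|Y)\big)+V\big(E(X|Y)\big),$$
where the first term equals $E(V(X|Y))$ by the tower property.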
