
All Questions

8 votes
2 answers
491 views

Conceptual Issues in the Measure Theoretic Proof of Conditional Expectations (via Radon-Nikodym)

I have been looking into measure theory (from a probabilist's perspective), and I have found the proof of the existence of the conditional expectation to feel a little "glossed over" in ...
tisPrimeTime
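A compressed version of the construction referred to above (a sketch, for integrable $X \ge 0$ and a sub-$\sigma$-algebra $\mathcal{G}$): the set function $\nu(A) := \int_A X \, dP$ for $A \in \mathcal{G}$ is a finite measure on $\mathcal{G}$, absolutely continuous with respect to $P\vert_{\mathcal{G}}$, so the Radon-Nikodym theorem provides a $\mathcal{G}$-measurable density,
$$ E[X \mid \mathcal{G}] := \frac{d\nu}{d\left(P\vert_{\mathcal{G}}\right)}, \qquad \text{so that} \qquad \int_A E[X \mid \mathcal{G}] \, dP = \int_A X \, dP \quad \text{for all } A \in \mathcal{G}; $$
the general integrable case follows by splitting $X = X^+ - X^-$.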
8 votes
1 answer
4k views

Radon-Nikodym derivative as a martingale

At the beginning of the material on the Girsanov theorem, we introduced the Radon-Nikodym derivative as $Z_\infty := \frac{d \mathbb{Q}}{d \mathbb{P}}\vert_{\mathcal{F}_\infty}$. Next, we considered ...
tubmaster • 728
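For the question above, a standard sketch (under the usual assumption that $\mathbb{Q} \ll \mathbb{P}$ on $\mathcal{F}_\infty$ and that each $\mathcal{F}_t \subseteq \mathcal{F}_\infty$) is that the restricted density process
$$ Z_t := \mathbb{E}_{\mathbb{P}}\left[ Z_\infty \,\middle|\, \mathcal{F}_t \right] = \frac{d\mathbb{Q}}{d\mathbb{P}}\Big\vert_{\mathcal{F}_t} $$
is a $\mathbb{P}$-martingale, since for $s \le t$ the tower property gives $\mathbb{E}_{\mathbb{P}}[Z_t \mid \mathcal{F}_s] = \mathbb{E}_{\mathbb{P}}\big[\mathbb{E}_{\mathbb{P}}[Z_\infty \mid \mathcal{F}_t] \mid \mathcal{F}_s\big] = Z_s$.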
8 votes
5 answers
2k views

Conditional expectation of independent variables

Claim. Let $Z_1, Z_2$ be two independent and identically distributed random variables. Then we have: $$ \mathbb E[Z_1|Z_1+Z_2] =\frac{Z_1+Z_2}{2}. $$ Proof. To see this, I have proceeded as follows. ...
RandomGuy • 1,407
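A common symmetry sketch for the claim above (assuming in addition that $Z_1, Z_2$ are integrable): since $(Z_1, Z_1+Z_2)$ and $(Z_2, Z_1+Z_2)$ have the same joint distribution, $\mathbb E[Z_1 \mid Z_1+Z_2] = \mathbb E[Z_2 \mid Z_1+Z_2]$ almost surely, and therefore
$$ 2\,\mathbb E[Z_1 \mid Z_1+Z_2] = \mathbb E[Z_1 + Z_2 \mid Z_1+Z_2] = Z_1 + Z_2, $$
which gives $\mathbb E[Z_1 \mid Z_1+Z_2] = \frac{Z_1+Z_2}{2}$ almost surely.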
8 votes
1 answer
2k views

Conditional expectation of $X$ given $|X|$

Let $X$ be an integrable random variable with density $f$ with respect to the Lebesgue measure. Compute the conditional expectation $\operatorname{E}\left[ X\, \Big|\, |X| \,\right]$. My ansatz was: $ \...
fred00 • 83
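For the question above, a commonly quoted answer (stated here only as a sketch, assuming $f(|X|) + f(-|X|) > 0$ almost surely) is
$$ \operatorname{E}\left[ X \,\Big|\, |X| \right] = |X| \cdot \frac{f(|X|) - f(-|X|)}{f(|X|) + f(-|X|)} \quad \text{a.s.}, $$
reflecting the heuristic that, given $|X| = y > 0$, the variable $X$ takes the values $y$ and $-y$ with conditional probabilities proportional to $f(y)$ and $f(-y)$.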
8 votes
1 answer
2k views

Conditional expectation and independence on $\sigma$-algebras and events

In many statistics papers, proofs might proceed as follows: Under the event $A$, the random variables $X$ and $Y$ are independent. (Often this means that on $A^C$, they might be dependent). Then some ...
air • 2,822
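One common reading of "$X$ and $Y$ are independent under the event $A$" (with $P(A) > 0$), offered here only as a plausible formalization of the phrase in the excerpt above, is independence under the conditional measure $P(\cdot \mid A)$:
$$ P(X \in B,\ Y \in C \mid A) = P(X \in B \mid A)\, P(Y \in C \mid A) \quad \text{for all Borel sets } B, C. $$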
8 votes
2 answers
199 views

Find the conditional expectation $E(U \mid \min(U, 1-U))$ where $U \sim U[0,1]$

Assume that $U$ is a random variable on $(\Omega, \mathcal{F}, P)$ with $U \sim U[0,1]$. Let $Y = \min(U, 1-U)$. I am asked to find $E(U \mid Y)$. I have figured out a proof for this and I want a sanity check. My ...
NamelessGods
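A short sketch of the usual answer to the question above: for $y \in [0, \tfrac12)$, the event $\{Y = y\}$ corresponds to $U \in \{y, 1-y\}$, and by the symmetry of $U[0,1]$ about $\tfrac12$ both values are equally likely given $Y$, so
$$ E(U \mid Y) = \frac{Y + (1-Y)}{2} = \frac12 \quad \text{a.s.} $$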
8 votes
1 answer
7k views

Polya's urn (martingale)

Suppose you have an urn containing one red ball and one green ball. You draw one at random; if the ball is red, put it back in the urn with an additional red ball; otherwise, put it back and add a ...
TripleX • 273
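For the urn above, the usual martingale is the fraction of red balls. A sketch, writing $R_n$ for the number of red balls after $n$ draws (so the urn then holds $n+2$ balls) and $M_n := R_n/(n+2)$:
$$ \mathbb{E}\left[M_{n+1} \mid \mathcal{F}_n\right] = \frac{1}{n+3}\left( \frac{R_n}{n+2}\,(R_n + 1) + \left(1 - \frac{R_n}{n+2}\right) R_n \right) = \frac{R_n}{n+2} = M_n, $$
so $(M_n)$ is a bounded martingale and converges almost surely.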
8 votes
2 answers
376 views

Show that $\mathbb{E}\left(\bar{X}_{n}\mid X_{(1)},X_{(n)}\right) = \frac{X_{(1)}+X_{(n)}}{2}$

Let $X_{1},\ldots,X_{n}$ be i.i.d. $U[\alpha,\beta]$ random variables, and let $X_{(1)}$ denote the minimum and $X_{(n)}$ the maximum. Show that $$ \mathbb{E}\left(\overline{X}_{n}\mid X_{(1)},X_{(n)}\right) = ...
GurrVasa • 433
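A sketch of the standard symmetry argument for the identity above, in the $U[\alpha,\beta]$ case stated: by exchangeability, $\mathbb{E}\left(X_i \mid X_{(1)}, X_{(n)}\right)$ is the same for every $i$, and given $\left(X_{(1)}, X_{(n)}\right)$ the $n-2$ intermediate observations are i.i.d. uniform on $\left[X_{(1)}, X_{(n)}\right]$, so
$$ n\,\mathbb{E}\left(\overline{X}_{n}\mid X_{(1)},X_{(n)}\right) = X_{(1)} + X_{(n)} + (n-2)\,\frac{X_{(1)}+X_{(n)}}{2} = n\,\frac{X_{(1)}+X_{(n)}}{2}. $$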
8 votes
1 answer
290 views

Show that $E(X\mid Y, Z) = E(X\mid Y)$ almost surely when $Z$ is independent of $(X, Y)$

$(X, Y, Z)$ is a continuous random vector and $Z$ is independent of $(X,Y)$. Prove that $E(X\mid Y, Z) = E(X\mid Y)$ almost surely. I have been thinking about this question tonight but couldn't figure out ...
eeeethan1997
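A sketch of one standard route for the statement above (assuming $X$ is integrable): $E(X \mid Y)$ is $\sigma(Y)$-measurable, so it suffices to check the defining property on $\sigma(Y, Z)$. For bounded measurable $g, h$, independence of $Z$ from $(X, Y)$ gives
$$ E\big[X\, g(Y)\, h(Z)\big] = E\big[X\, g(Y)\big]\, E\big[h(Z)\big] = E\big[E(X \mid Y)\, g(Y)\big]\, E\big[h(Z)\big] = E\big[E(X \mid Y)\, g(Y)\, h(Z)\big], $$
and a monotone class (or $\pi$-$\lambda$) argument extends this from product sets $\{Y \in B\} \cap \{Z \in C\}$ to all of $\sigma(Y, Z)$.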
8 votes
3 answers
580 views

How can two seemingly identical conditional expectations have different values?

Background. Suppose that we are using a simplified spherical model of the Earth's surface with latitude $u \in (-\frac{\pi}{2}, \frac{\pi}{2})$ and longitude $v \in (-\pi, \pi)$. Restricting attention ...
Ethan Mark • 2,187
8 votes
0 answers
242 views

Regular conditional probability on Polish space and absolute continuity

Let $(\Omega,\mathcal F,\mathbb P)$ be a standard Borel space (i.e. $\Omega$ is Polish and $\mathcal F = \mathcal B(\Omega)$). Then $\mathcal F$ is separable and for every sub-$\sigma$-algebra $\mathcal ...
Cyril B. • 115
7 votes
2 answers
3k views

Conditional Hölder inequality

We consider, on a probability space $(\Omega,\mathcal{A},P)$, two random variables $X$ and $Y$, and let $\mathcal{H} \subset \mathcal{A}$ be a $\sigma$-algebra. Let $p,q>1$ be such that $\frac{1}{p}+\...
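The inequality in question is usually stated as follows (a sketch of the statement, assuming $X \in L^p$ and $Y \in L^q$):
$$ E\big[\,|XY|\;\big|\;\mathcal{H}\,\big] \;\le\; E\big[\,|X|^p\;\big|\;\mathcal{H}\,\big]^{1/p}\, E\big[\,|Y|^q\;\big|\;\mathcal{H}\,\big]^{1/q} \quad \text{a.s.}, $$
i.e. the ordinary Hölder inequality with expectations replaced by conditional expectations given $\mathcal{H}$.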
7 votes
1 answer
651 views

Rigorous definition of the conditional expectation $E(X|Y=y)$ when $P(Y=y)=0$

Let $X$ be an integrable random variable on $(\Omega, \mathfrak A, P)$. I've learned that for an event $A$ of non-zero probability, $$ E(X|A) = \int X(\omega) \,dP(\omega|A) = \frac{1}{P(A)}\int_A X ...
Epiousios • 3,246
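For the zero-probability case in the question above, the standard route (sketched here) uses the Doob-Dynkin lemma: $E(X \mid Y)$, being $\sigma(Y)$-measurable, can be written as $g(Y)$ for some Borel function $g$, and one defines $E(X \mid Y = y) := g(y)$. Equivalently, $g$ is characterized, uniquely up to $P_Y$-null sets, by
$$ \int_B g(y)\, P_Y(dy) = \int_{\{Y \in B\}} X \, dP \quad \text{for all Borel sets } B, $$
where $P_Y$ denotes the distribution of $Y$.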
7 votes
1 answer
2k views

Joint densities and conditional densities of sums of i.i.d. normally distributed random variables

Let $X_1, X_2, \ldots$ be independent with the common normal density $\eta$, and let $S_k = X_1 + \cdots + X_k$. If $m < n$, find the joint density of $(S_m, S_n)$ and the conditional density of $S_m$ given that $S_n = t$. ...
Comic Book Guy
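A sketch of the usual answer, under the extra assumption that $\eta$ is the standard normal density: $(S_m, S_n)$ is jointly Gaussian with $\operatorname{Var}(S_m) = m$, $\operatorname{Var}(S_n) = n$ and $\operatorname{Cov}(S_m, S_n) = m$, so the conditional law of $S_m$ given $S_n = t$ is
$$ S_m \mid S_n = t \;\sim\; N\!\left( \frac{m}{n}\, t,\ \frac{m(n-m)}{n} \right). $$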
7 votes
4 answers
905 views

Why is $E[X\mid X+Y] = E[Y\mid X+Y]$ if $X, Y$ are i.i.d. random variables?

In the proof of the fact that $E[X\mid X+Y] = \frac{X+Y}{2}$ when $X, Y$ are independent, identically distributed random variables, one uses the observation that $E[X\mid X+Y] = E[Y\mid X+Y]$, but I don't see why this ...
hugo • 117
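A short sketch of the usual reason, assuming $X, Y$ are i.i.d. and integrable: the pairs $(X, X+Y)$ and $(Y, X+Y)$ have the same joint distribution (swap $X$ and $Y$), and a conditional expectation depends only on the joint law, so
$$ E[X \mid X+Y] = E[Y \mid X+Y] \quad \text{a.s.}; $$
adding the two and using $E[X+Y \mid X+Y] = X+Y$ then yields $E[X \mid X+Y] = \frac{X+Y}{2}$.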
