
Motivating Question: Let $X$, $Y$ be independent standard uniform random variables. How does one show, rigorously, that $$ \mathbb{E}[X \mid X+Y = 1] = \frac{1}{2}? $$

I would be interested in hearing answers both from the 'probabilistic' view of integration, and the 'geometric' view of it. (Does the probabilist need to know that $\mathbb{R}^2$ is a metric space in order to do the integral? Do they need to know that it is a Riemannian manifold? Does the geometer need to know how Lebesgue measure is constructed on $\mathbb{R}$?)

Semi-rigorous attempt: The conditional measure is supported on the line $L$ given by $x+y=1$. The quotient $$ \frac{\displaystyle \int_{y=0}^{1} \int_{x=1-y-\epsilon}^{1-y+\epsilon} x \, dx \, dy} {\displaystyle \int_{y=0}^{1} \int_{x=1-y-\epsilon}^{1-y+\epsilon} 1 \, dx \, dy} = \frac{\epsilon}{2 \epsilon} = \frac{1}{2}, $$ as required. Unfortunately, this computes the limit along only one very specific sequence of "converging neighbourhoods" of $L$ (also, the limits of integration are not quite right near the boundary of the square, but that isn't the main issue).
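A quick Monte Carlo sanity check (not a proof; the sample size and $\epsilon$ below are arbitrary choices) is consistent with this: for small $\epsilon$, the empirical mean of $X$ over the samples satisfying $|X+Y-1| \le \epsilon$ is close to $\frac{1}{2}$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, eps = 10_000_000, 0.01

x = rng.uniform(size=n)  # X ~ Uniform(0, 1)
y = rng.uniform(size=n)  # Y ~ Uniform(0, 1), independent of X

mask = np.abs(x + y - 1) <= eps   # the epsilon-neighbourhood of the line x + y = 1

print(mask.mean())     # roughly 2 * eps, the probability of the neighbourhood
print(x[mask].mean())  # roughly 0.5, the conditional mean of X given the neighbourhood
```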

Edit: For me, the best answers would be more general than simply providing an answer to the motivating question. I am aware that a symmetry trick plus the observation $\mathbb{E}[X + Y | X+Y = 1] =\mathbb{E}[1 | X+Y =1] = 1 = \mathbb{E}[X | X+Y = 1] + \mathbb{E}[Y | X+Y = 1]$ solves the problem. Perhaps a general discussion of how conditioning on an event of probability zero is even defined would help.
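For reference, one standard way to make the symmetry trick fully rigorous is to condition on the $\sigma$-algebra $\sigma(X+Y)$ rather than on the single null event (this is also one standard answer to how conditioning on a probability-zero event is defined at all): by exchangeability of $(X,Y)$ we have $\mathbb{E}[X \mid \sigma(X+Y)] = \mathbb{E}[Y \mid \sigma(X+Y)]$ almost surely, and their sum is $\mathbb{E}[X+Y \mid \sigma(X+Y)] = X+Y$, so $$ \mathbb{E}[X \mid X+Y] = \frac{X+Y}{2} \quad \text{a.s.} $$ This object is only defined up to null sets of the law of $X+Y$, which is exactly where the subtlety lives; choosing the continuous version $s \mapsto s/2$ and evaluating it at $s = 1$ gives $\frac{1}{2}$.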

  • Hint: It's the same as $\mathbb E[Y\,|\,X+Y=1]$.
    – lulu
    Commented Nov 17, 2022 at 22:52
  • The key here is the notion of exchangeability.
    – jlammy
    Commented Nov 17, 2022 at 22:54
  • @lulu Yes indeed, but this isn't really the answer I am looking for (perhaps I could have explained what I am looking for better; I edited the question in an attempt to do so). Commented Nov 17, 2022 at 23:02
  • Re conditioning on events of measure $0$ (and more generally, conditional expectation as a whole) I think this pdf is great
    – jlammy
    Commented Nov 17, 2022 at 23:04
  • I truly do not understand why there is a differential geometry tag here. Yes, a probabilist needs to know the standard Riemann/Lebesgue integral in $\Bbb R^2$. Where's any differential geometry? Commented Nov 18, 2022 at 6:28

1 Answer


Ignoring the issues of integrating outside the square, your approach computes $\frac{E(X\,I(1 - \varepsilon \leq X + Y \leq 1 + \varepsilon))}{P(1 - \varepsilon \leq X + Y \leq 1 + \varepsilon)} = E(X \mid 1 - \varepsilon \leq X + Y \leq 1 + \varepsilon)$. It seems reasonable that this converges to $E(X \mid X + Y = 1)$ as $\varepsilon \to 0$, but this strategy doesn't always work; see the Borel–Kolmogorov paradox: https://en.wikipedia.org/wiki/Borel%E2%80%93Kolmogorov_paradox.
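To see concretely how the limit can depend on the choice of shrinking neighbourhoods, compare the events $\{|X+Y-1| \le \varepsilon\}$ and $\{|X+Y-1| \le \varepsilon X\}$: both shrink to the same null event $\{X+Y=1\}$, but the second family weights the line proportionally to $x$, so the conditional means converge to $\frac{1}{2}$ and $\frac{\int_0^1 x^2\,dx}{\int_0^1 x\,dx} = \frac{2}{3}$ respectively. A rough numerical sketch of both limits (sample size and $\varepsilon$ are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, eps = 10_000_000, 0.01
x = rng.uniform(size=n)
y = rng.uniform(size=n)

# Two families of events, both shrinking to the null event {X + Y = 1} as eps -> 0.
slab  = np.abs(x + y - 1) <= eps      # band of uniform width around the line
wedge = np.abs(x + y - 1) <= eps * x  # band whose width is proportional to x

print(x[slab].mean())   # approaches 1/2
print(x[wedge].mean())  # approaches 2/3
```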

I think that a probabilist would use a different way to compute $E(X \mid X + Y = 1)$. They might first compute the conditional density $$f_X(x \mid X + Y = 1) = \frac{f_{X + Y}(1 \mid X = x)\,f_X(x)}{f_{X + Y}(1)} = \frac{f_Y(1 - x)\cdot 1}{\int_{0}^{1}f_Y(1 - t)\,dt} = \frac{1}{\int_{0}^{1}1\,dt} = 1, \qquad 0 \le x \le 1,$$ and conclude that $$E(X \mid X + Y = 1) = \int_{0}^{1}x\,f_X(x \mid X + Y = 1)\,dx = \int_{0}^{1}x\,dx = \frac{1}{2}.$$
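For completeness, the same computation works at any level $s \in (0,1]$: $f_{X+Y}(s) = s$ and $f_Y(s - x) = 1$ for $x \in [0, s]$, so the conditional density is $f_X(x \mid X + Y = s) = \frac{1}{s}$ on $[0, s]$ and $$E(X \mid X + Y = s) = \int_0^s \frac{x}{s}\,dx = \frac{s}{2},$$ which gives $\frac{1}{2}$ at $s = 1$.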
