Motivating Question: Let $X$,$Y$ be independent standard uniform random variables. How does one show, rigorously, that $$ \mathbb{E}[X \mid X+Y = 1] = \frac{1}{2}? $$
I would be interested in hearing answers from both the 'probabilistic' view of integration and the 'geometric' one. (Does the probabilist need to know that $\mathbb{R}^2$ is a metric space in order to do the integral? Do they need to know that it is a Riemannian manifold? Does the geometer need to know how Lebesgue measure is constructed on $\mathbb{R}$?)
Semi-rigorous attempt: The conditional measure is supported on the line $L$ given by $x+y=1$. Conditioning on the strip $|X+Y-1| < \epsilon$ gives the quotient $$ \frac{\displaystyle \int_{y=0}^{1} \int_{x=1-y-\epsilon}^{1-y+\epsilon} x \, dx \, dy} {\displaystyle \int_{y=0}^{1} \int_{x=1-y-\epsilon}^{1-y+\epsilon} 1 \, dx \, dy} = \frac{\epsilon}{2\epsilon} = \frac{1}{2}, $$ as required. Unfortunately, this computes the limit only along one very specific sequence of "converging neighbourhoods" of $L$ (also the limits of integration are not quite right near the boundary of the square, but this isn't the main issue).
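As a sanity check on the shrinking-strip heuristic (not a proof), here is a small Monte Carlo sketch: it estimates $\mathbb{E}[X \mid |X+Y-1| < \epsilon]$ by rejection sampling and lets $\epsilon$ shrink. The function name and sample size are my own choices for illustration.

```python
import random

def cond_mean_strip(eps, n=200_000, seed=0):
    """Monte Carlo estimate of E[X | |X+Y-1| < eps] for X, Y i.i.d. U(0,1).

    Samples (X, Y) uniformly on the unit square and averages X over the
    samples that land in the strip of width 2*eps around the line x+y=1.
    """
    rng = random.Random(seed)
    total, count = 0.0, 0
    for _ in range(n):
        x, y = rng.random(), rng.random()
        if abs(x + y - 1) < eps:
            total += x
            count += 1
    return total / count

# The estimate stays near 1/2 as the strip shrinks.
for eps in (0.2, 0.05, 0.01):
    print(eps, cond_mean_strip(eps))
```

Of course this only probes the same one-parameter family of neighbourhoods as the integral above, so it illustrates the heuristic rather than resolving the question of which family of neighbourhoods is the "right" one.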
Edit: For me, the best answers would be more general than simply providing an answer to the motivating question. I am aware that a symmetry trick plus the observation $\mathbb{E}[X + Y | X+Y = 1] =\mathbb{E}[1 | X+Y =1] = 1 = \mathbb{E}[X | X+Y = 1] + \mathbb{E}[Y | X+Y = 1]$ solves the problem. Perhaps a general discussion of how conditioning on an event of probability zero is even defined would help.
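For concreteness, one standard way to make the conditioning rigorous is to disintegrate through the joint density of $(X, S)$ with $S = X+Y$: the regular conditional density is $f_{X \mid S}(x \mid s) = f_{X,S}(x,s)/f_S(s)$, and at $s=1$ this works out to the $\mathrm{Uniform}(0,1)$ density, giving mean $\tfrac{1}{2}$. A sketch of that computation in sympy (the slice $s=1$ is hard-coded; the variable names are mine):

```python
import sympy as sp

x = sp.symbols('x', positive=True)

# Joint density of (X, S) with S = X + Y is f_X(x) * f_Y(s - x),
# i.e. 1 on {0 < x < 1, 0 < s - x < 1}.  Along the slice s = 1 it
# equals 1 for all 0 < x < 1.
f_XS_at_1 = sp.Integer(1)

# Marginal density of S at s = 1: integrate the joint density in x.
# (This is the peak of the triangular density of X + Y.)
f_S_at_1 = sp.integrate(f_XS_at_1, (x, 0, 1))    # = 1

# Regular conditional density of X given S = 1, and its mean.
cond_density = f_XS_at_1 / f_S_at_1              # = 1 on (0,1): Uniform(0,1)
cond_mean = sp.integrate(x * cond_density, (x, 0, 1))
print(cond_mean)                                 # 1/2
```

The point of the disintegration viewpoint is that the conditional density is defined for (Lebesgue-)almost every $s$ simultaneously, which sidesteps conditioning on the single null event $\{S = 1\}$ in isolation.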