
Let $( \mathbb{P}_t )_{ t \geq 0 }$ be a family of probability measures on the measurable space $(\mathbb{R}, \mathcal{B}(\mathbb{R}))$ such that $$ \int_{\mathbb{R}} x^2 \mathbb{P}_t (dx) < \infty \quad \forall t \geq 0, $$ and assume that the mapping $$ [0, \infty) \ni t \mapsto \mathbb{P}_t $$ is continuous with respect to the $2$nd Wasserstein metric denoted by $W_2(\cdot, \cdot)$.

Now let $f : \mathbb{R} \times [0, \infty) \rightarrow [0, \infty]$ be a measurable function.

Is it true that the mapping $$ [0, \infty) \ni t \mapsto \int_{\mathbb{R}} f(x,t) \mathbb{P}_t(dx) \tag{1} $$ is measurable, i.e., $\mathcal{B}([0, \infty))-\mathcal{B}([0, \infty])$-measurable?

Clearly, Tonelli's theorem implies that for every fixed $s \in [0, \infty)$ the mapping $$ [0, \infty) \ni t \mapsto \int_{\mathbb{R}} f(x,t) \mathbb{P}_{\color{red}s}(dx) \tag{2} $$ is measurable in the above sense. But what can be said about the case above? Does one need additional assumptions on the function $f$? Is it perhaps possible to show that the mapping in $(1)$ is continuous?


Some further thoughts:

Assume that the stochastic process $(X_t)_{t \geq 0}$ has (right-)continuous sample paths and the law of $X_t$ is given by $\mathbb{P}_t$, $t \geq 0$. The mapping $$ [0, \infty) \times \Omega \ni (t, \omega) \mapsto X_t (\omega) $$ is jointly measurable. We can further observe that the mapping $$ [0, \infty) \times \Omega \ni (t, \omega) \mapsto ( X_t (\omega), t) $$ is also jointly measurable (since every component is jointly measurable). It then follows that the composition $f(X_t(\omega), t)$ is jointly measurable. We can then apply Tonelli's theorem to conclude that $$ [0, \infty) \ni t \mapsto \mathbb{E}[f(X_t, t)] = \int_{\mathbb{R}} f(x,t) \mathbb{P}_t(dx) $$ is measurable. So the question is whether such a stochastic process always exists. Could the continuity in the Wasserstein metric (which also implies weak convergence) be helpful?
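For a concrete instance of this construction (a numerical sketch, not part of the argument): take $\mathbb{P}_t = N(0,t)$, the marginals of Brownian motion, and $f(x,t)=x^2$. Then $\int_{\mathbb{R}} f(x,t)\,\mathbb{P}_t(dx)=t$ exactly, and direct quadrature recovers this; the helper names and grid bounds below are ad-hoc choices.

```python
import math

def gauss_pdf(x, m, s):
    # density of N(m, s^2)
    return math.exp(-((x - m) ** 2) / (2 * s * s)) / (s * math.sqrt(2 * math.pi))

def F(t, f, lo=-30.0, hi=30.0, n=60000):
    # midpoint Riemann sum for  t -> \int f(x, t) P_t(dx),  with P_t = N(0, t)
    s = math.sqrt(t)
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        total += f(x, t) * gauss_pdf(x, 0.0, s) * dx
    return total

f = lambda x, t: x * x  # second moment of N(0, t) is exactly t
print(F(1.0, f))        # close to 1.0
print(F(2.0, f))        # close to 2.0
```

For this family, $t\mapsto F(t)$ is even continuous, consistent with the hope expressed above; of course this is one example, not a proof.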


2 Answers


Yes, the function $F_f:[0,\infty)\to[0,\infty]$; $$F_f(t)=\mathbb{E}_{Y\sim\mathbb{P}_t}[f(Y,t)]$$ is measurable without any further conditions.

Let $\mathcal{M}=\{f:F_f\text{ is measurable}\}$. I claim:

  1. $\mathcal{M}$ is an $\mathbb{R}$-vector space.
  2. $\mathcal{M}$ is closed under monotone-increasing pointwise limits.
  3. $\mathcal{M}$ contains all continuous functions of compact support.

By the monotone class theorem, (1-3) imply that $\mathcal{M}$ contains the Baire-measurable functions, which (since $\mathbb{R}$ is $\sigma$-compact) are precisely the Borel-measurable ones too.


Digression: that conclusion is also easy to check "by hand".

Start by noticing that $x\mapsto(1-|x|^n)1_{B(0,1)}(x)$ is continuous with compact support; taking the (monotone) supremum over $n$, $1_{B(0,1)}\in\mathcal{M}$. More generally, (2)–(3) imply that $\{1_{B(x,r)}:x\in\mathbb{R}^2,\,r\in\mathbb{R}^+\}\subseteq\mathcal{M}$.
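A quick numerical check of that first step (a Python sketch, using the one-dimensional ball $B(0,1)=(-1,1)$ for simplicity):

```python
def g(n, x):
    # the continuous bump  x -> (1 - |x|^n) * 1_{B(0,1)}(x)
    return 1 - abs(x) ** n if abs(x) < 1 else 0.0

# for each fixed x, g(n, x) increases in n with supremum 1_{B(0,1)}(x)
for x in (0.0, 0.5, 0.9, 1.0, 1.5):
    values = [g(n, x) for n in range(1, 200)]
    assert all(a <= b + 1e-15 for a, b in zip(values, values[1:]))
    limit = 1.0 if abs(x) < 1 else 0.0
    assert abs(values[-1] - limit) < 1e-6
```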

Now each open set in $\mathbb{R}$ is a countable disjoint union of open intervals (its connected components), so its characteristic function is a countable pointwise sum of functions of the form $1_{B(x,r)}$; by (1) and (2), $1_U\in\mathcal{M}$ for every open $U$. (In $\mathbb{R}^2$ the connected components of an open set need not be balls, but one can instead write $1_U$ as the monotone limit of the continuous, compactly supported functions $x\mapsto\min(1,n\,d(x,U^c))\max(0,\min(1,n-|x|))$, so (2)–(3) apply directly.)

Thus $1\in\mathcal{M}$ (as the monotone limit of $1_{B(0,n)}$); subtracting via (1) gives us complementation. Applying complementation, we obtain the characteristic function of any closed set; iterating complementation and (2) through the Borel hierarchy (which stabilizes after $\omega_1$ stages), $\mathcal{M}$ contains the characteristic functions of all Borel sets.


Returning to my claims above:

To prove (1), use linearity of expectation.

To prove (2), use the Monotone Convergence Theorem: if $f_n\uparrow f$ pointwise with each $f_n\in\mathcal{M}$, then $F_{f_n}\to F_f$ pointwise, so $F_f$ is a pointwise limit of measurable functions. (Equivalently: if $f=\sum_n{g_n}$ with $g_n\geq0$ and each $g_n\in\mathcal{M}$, then $F_f=\sum_n{F_{g_n}}$.)

To prove (3), suppose $f$ is continuous with compact support $K$. By Stone–Weierstrass, there exist two doubly indexed families of continuous functions $\{X_{n,j}\}$ and $\{T_{n,j}\}$, $1\leq j\leq J_n$, $n\geq1$, such that

  1. uniformly in $x$ and $t$, $$f(x,t)=\lim_{n\to\infty}{\sum_{j=1}^{J_n}{X_{n,j}(x)T_{n,j}(t)}}$$ and
  2. each $X_{n,j}$ is Lipschitz.

From (1), $$F_f(t)=\lim_{n\to\infty}{\mathbb{E}_{Y\sim\mathbb{P}_t}\left[\sum_{j=1}^{J_n}{X_{n,j}(Y)T_{n,j}(t)}\right]}=\lim_{n\to\infty}{\sum_{j=1}^{J_n}{T_{n,j}(t)}\mathbb{E}_{Y\sim\mathbb{P}_t}[X_{n,j}(Y)]}$$ Measurable functions are closed under algebraic operations and pointwise limits, and each $T_{n,j}$ is continuous, so it suffices to show that $F_{X_{n,j}}(t)=\mathbb{E}_{Y\sim\mathbb{P}_t}[X_{n,j}(Y)]$ is measurable.

In fact, $F_{X_{n,j}}$ is continuous. To see this, let $X_{n,j}$ have Lipschitz constant $L$, and let $(Y_1,Y_2)$ be any coupling with $Y_1\sim\mathbb{P}_t$ and $Y_2\sim\mathbb{P}_s$. Then \begin{align*} (\mathbb{E}_{Y\sim\mathbb{P}_t}[X_{n,j}(Y)]-\mathbb{E}_{Y\sim\mathbb{P}_s}[X_{n,j}(Y)])^2&=\left(\mathbb{E}[X_{n,j}(Y_1)-X_{n,j}(Y_2)]\right)^2 \\ &\leq L^2\left(\mathbb{E}|Y_1-Y_2|\right)^2 \tag{a} \\ &\leq L^2\,\mathbb{E}[(Y_1-Y_2)^2] \tag{b} \end{align*} where (a) follows from the Lipschitz bound $|X_{n,j}(y_1)-X_{n,j}(y_2)|\leq L|y_1-y_2|$ and (b) from Jensen's inequality. The left-hand side does not depend on the coupling, so taking the infimum over all couplings of $(\mathbb{P}_t,\mathbb{P}_s)$ yields $$\left(F_{X_{n,j}}(t)-F_{X_{n,j}}(s)\right)^2\leq L^2\,W_2(\mathbb{P}_t,\mathbb{P}_s)^2.$$ As $t\to s$, $W_2(\mathbb{P}_t,\mathbb{P}_s)\to0$, and the claim follows.
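A numerical illustration of this Lipschitz-versus-$W_2$ bound (a sketch with invented parameters: two Gaussian measures, for which $W_2$ has the closed form $\sqrt{(m_1-m_2)^2+(s_1-s_2)^2}$, and the 1-Lipschitz test function $y\mapsto\min(y,1)$):

```python
import math

def gauss_pdf(x, m, s):
    # density of N(m, s^2)
    return math.exp(-((x - m) ** 2) / (2 * s * s)) / (s * math.sqrt(2 * math.pi))

def expect(h, m, s, lo=-25.0, hi=25.0, n=50000):
    # midpoint Riemann sum for E[h(Y)], Y ~ N(m, s^2)
    dx = (hi - lo) / n
    return sum(h(lo + (i + 0.5) * dx) * gauss_pdf(lo + (i + 0.5) * dx, m, s)
               for i in range(n)) * dx

def w2_gauss(m1, s1, m2, s2):
    # closed-form W_2 distance between one-dimensional Gaussians
    return math.sqrt((m1 - m2) ** 2 + (s1 - s2) ** 2)

h = lambda y: min(y, 1.0)               # 1-Lipschitz (L = 1)
m1, s1, m2, s2 = 0.0, 1.0, 0.3, 1.2     # arbitrary example parameters
diff = abs(expect(h, m1, s1) - expect(h, m2, s2))
bound = 1.0 * w2_gauss(m1, s1, m2, s2)  # L * W_2(P_t, P_s)
assert diff <= bound
```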

  • Could you clarify how you show measurability in part 2 without any additional assumptions on $f$?
    – Holden
    Commented May 15, 2022 at 15:49
  • @Holden: I don't; in part 2, I reduce to the case where $f$ has compact support. Commented May 18, 2022 at 18:35
  • Could you elaborate on the part with the Baire hierarchy? If $f(x, t)=f(x)=1_A(x)$ for a Borel measurable set $A$, how do you proceed in this case to show that $t \mapsto \int_{\mathbb{R}} 1_A(x) \mathbb{P}_t(dx)=\mathbb{P}(Y_t \in A)$ is measurable?
    – Holden
    Commented May 19, 2022 at 10:56
  • @Holden: I've rewritten this solution to use the monotone class theorem. Is that any clearer? Commented May 20, 2022 at 0:33
  • Do you make use of the continuity w.r.t. the Wasserstein metric? If not, isn't this a counterexample? Let $N \subset [0, \infty)$ be a set which is not Borel measurable, and consider the function $[0, \infty) \ni t \mapsto 1_N(t)$. Let $(Y_t)_{t \geq 0}$ be random variables such that $\mathbb{P}(Y_t = 1) = 1_N(t)$. Then $t \mapsto \mathbb{P}(Y_t =1)$ is not measurable.
    – Holden
    Commented May 20, 2022 at 8:52

Caution: extremely soft answer.

Take a probability measure $\mu$ on $[0,\infty)$. Assume the distribution of $(X,T)$ is $\mu(dt)\,\mathbb P_t(dx)$, i.e. such that $\mathbb E[h(X,T)] = \int_t\left(\int_x h(x,t)\,\mathbb P_t(dx)\right)\mu(dt)$ for every measurable $h$. To even be able to define this, we need your statement to be true: the inner integral must be a measurable function of $t$.
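A minimal sketch of that two-stage construction (assuming, purely for illustration, a discrete $\mu$ on two time points and $\mathbb P_t = N(0,t)$, so that $\int x^2\,\mathbb P_t(dx) = t$ and the outer integral reduces to a finite sum):

```python
import random

# hypothetical discrete mixing measure mu on two time points
mu = {0.5: 0.4, 1.5: 0.6}

# iterated integral for h(x, t) = x^2:
#   \int (\int x^2 P_t(dx)) mu(dt) = \int t mu(dt)
second_moment = sum(p * t for t, p in mu.items())  # 0.4*0.5 + 0.6*1.5 = 1.1

# two-stage sampling: T ~ mu, then X | T = t  ~  N(0, t)
random.seed(0)
times, weights = list(mu), list(mu.values())
acc = 0.0
N = 200_000
for _ in range(N):
    t = random.choices(times, weights)[0]
    x = random.gauss(0.0, t ** 0.5)
    acc += x * x
# Monte Carlo estimate of E[X^2] matches the iterated integral
assert abs(acc / N - second_moment) < 0.05
```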

This is a ubiquitous construction in probability theory, so there should be a probability textbook in which your statement is proved as a basic fact at the beginning of a chapter on transition kernels or random measures. I'll try to look it up.

  • Oops, sorry, I think you should remove the integral symbol. I meant a conditional measure: choose $T$ according to $\mu(dt)$, then conditional on $T$, choose $X$ according to $\mathbb P_T(dx)$.
    – justt
    Commented May 15, 2022 at 10:20
  • Oooh, just realized that to define the setting I mentioned, you need your result to hold.
    – justt
    Commented May 15, 2022 at 10:27
  • I have added further thoughts to the post.
    – Holden
    Commented May 16, 2022 at 22:04
