$\begingroup$

Let $\{Y_j\}_{j=1}^\infty$ be i.i.d. real random variables on a common probability space. For each $n\in\mathbb{N}$, define $X_n = x_0 + \sum_{j=1}^n Y_j$, where $x_0$ is a constant, and set $X_0=x_0$. Let $\mathcal{F}_n=\mathcal{F}_n^Y$ be the natural filtration of the $Y_j$'s.

I want to show $(X_n)$ is a Markov process with respect to $\mathcal{F}_n$ with transition probability given by $$p(x,A)=P(x+Y\in A),$$ where $x$ is a real number, $A$ is a Borel set, and $Y$ is any one of the $Y_j$'s.

I have verified $p(\cdot,\cdot)$ is a valid transition probability, but I don't know how to show it satisfies the following Markov property:

$$P(X_{n+1}\in A | \mathcal{F}_n)(\omega)=p(X_n(\omega),A)$$ for every $n$ and almost every $\omega$.

For discrete $Y_j$'s, this can be done by conditioning on the values of the $Y_j$'s, as is done for Markov chains. I have tried to approximate general random variables with discrete ones using simple functions, but this method does not seem to work because the approximations of the $Y_j$'s might not be i.i.d.
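To make the target identity concrete, here is a Monte Carlo sanity check (not a proof) of its integrated form $\mathbb E[\chi_{\{X_{n+1}\in A\}}\chi_B] = \mathbb E[p(X_n,A)\chi_B]$ for an $\mathcal F_n$-measurable event $B$. All specifics below ($Y_j\sim N(0,1)$, $x_0=0$, $n=3$, $A=[0,\infty)$, $B=\{X_3>0\}$) are illustrative assumptions, not part of the question:

```python
# Monte Carlo sanity check (illustrative assumptions: Y_j ~ N(0,1), x_0 = 0).
# For A = [0, inf), p(x, A) = P(x + Y >= 0) = Phi(x), the standard normal CDF.
# The claimed Markov property implies E[1{X_4 in A} 1_B] = E[p(X_3, A) 1_B]
# for any F_3-measurable event B; here B = {X_3 > 0}.
from math import erf, sqrt
import numpy as np

Phi = np.vectorize(lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0))))  # N(0,1) CDF

rng = np.random.default_rng(0)
N = 200_000
Y = rng.standard_normal((N, 4))      # Y_1, ..., Y_4 for N sample paths
X3 = Y[:, :3].sum(axis=1)            # X_3 = Y_1 + Y_2 + Y_3  (x_0 = 0)
X4 = X3 + Y[:, 3]                    # X_4 = X_3 + Y_4

B = X3 > 0                           # an F_3-measurable event
lhs = np.mean((X4 >= 0) & B)         # E[1{X_4 >= 0} 1_B]
rhs = np.mean(Phi(X3) * B)           # E[p(X_3, [0, inf)) 1_B]

assert abs(lhs - rhs) < 0.01         # agree up to Monte Carlo error
```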

$\endgroup$

2 Answers

$\begingroup$

By $Ind(A)$ I mean $\chi_A$, the indicator function.

$\mathbb P ( X_{n+1} \in A | \mathcal F_n ) = \mathbb E [ Ind( X_{n+1} \in A ) | \mathcal F_n ] = \mathbb E [ Ind (X_n + Y_{n+1} \in A) | \mathcal F_n]$, since $X_{n+1} = X_n + Y_{n+1}$.

Now note that $\mathbb E [ f(X,Y) | \mathcal G] = h(X)$, where $h(t) = \mathbb E[f(t,Y)]$, when $X$ is $\mathcal G$-measurable and $Y$ is independent of $\mathcal G$.

By that we have $\mathbb E [ Ind (X_n + Y_{n+1} \in A) | \mathcal F_n] = h(X_n)$, where $h(t) = \mathbb E[ Ind(t + Y_{n+1} \in A) ] = \mathbb P ( t+ Y_{n+1} \in A) = p(t,A)$.

So $h(X_n) = p(X_n,A)$, which leads us to $\mathbb P ( X_{n+1} \in A | \mathcal F_n) = p(X_n,A)$ almost surely.

Edit: Maybe I should clarify: $X_n$ is $\mathcal F_n = \sigma (Y_1,\dots,Y_n)$-measurable and $Y_{n+1}$ is independent of $\mathcal F_n$ (since the $Y_k$'s are independent).
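As a numerical sanity check of this lemma, one can verify the defining identity $\mathbb E[f(X,Y)\chi_B] = \mathbb E[h(X)\chi_B]$ for a $\mathcal G$-measurable event $B$ by simulation. The distributions, $f$, and $B$ below are illustrative assumptions, not part of the answer:

```python
# Check E[f(X,Y) 1_B] ~= E[h(X) 1_B] for G = sigma(X), h(t) = E[f(t, Y)].
# Illustrative choices: X ~ Unif(0,1), Y ~ Exp(1) independent,
# f(x, y) = 1{x + y <= 1.2}, B = {X < 0.5}.
import numpy as np

rng = np.random.default_rng(1)
N = 500_000
X = rng.uniform(0.0, 1.0, N)
Y = rng.exponential(1.0, N)

f_val = X + Y <= 1.2                 # f(X, Y)
h = 1.0 - np.exp(-(1.2 - X))         # h(t) = P(t + Y <= 1.2); valid since t < 1.2
B = X < 0.5                          # a sigma(X)-measurable event

lhs = np.mean(f_val & B)             # E[f(X, Y) 1_B]
rhs = np.mean(h * B)                 # E[h(X) 1_B]
assert abs(lhs - rhs) < 0.01         # agree up to Monte Carlo error
```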

$\endgroup$
  • $\begingroup$ Thank you so much. I just want to add here that $\mathbb E [ f(X,Y) | \mathcal G] = h(X)$ can be proved by reduction to indicator functions of rectangles. $\endgroup$ Commented Feb 17, 2020 at 18:02
  • $\begingroup$ Exactly, or (since we need to prove for $A \in \mathcal G$ that $\mathbb E[f(X,Y) \chi_A] = \mathbb E[h(X) \chi_A]$) taking your problem to more dimensions: define the random vector $(X,Y,\chi_A)$, observe that by independence of $(X,\chi_A)$ and $Y$ its distribution is $\mu_{(X,\chi_A)} \otimes \mu_Y$, and play with Fubini. $\endgroup$ Commented Feb 17, 2020 at 19:09
$\begingroup$

Let $n\geqslant m$ be nonnegative integers and $f$ a bounded measurable function. Then \begin{align} \mathbb E[f(X_n)\mid \mathcal F_m] &= \mathbb E\left[f\left(x_0+\sum_{j=1}^n Y_j \right)\mid\mathcal F_m\right]\\ &=\mathbb E\left[f\left(x_0 + \sum_{j=1}^m Y_j + \sum_{j=m+1}^n Y_j \right)\mid \mathcal F_m\right]\\ &= \mathbb E\left[f\left(x_0 + \sum_{j=1}^m Y_j + \sum_{j=m+1}^n Y_j \right)\mid \sigma(X_m)\right], \end{align} since $x_0$ is a constant, $\sum_{j=1}^m Y_j = X_m - x_0$ is $\sigma(X_m)$-measurable, and $\sum_{j=m+1}^n Y_j$ is independent of $\mathcal F_m$. The last expression equals $h(X_m)$ with $h(t) = \mathbb E\left[f\left(t + \sum_{j=m+1}^n Y_j\right)\right]$, so $\{X_n\}$ is a Markov process.
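This identity can likewise be sanity-checked by simulation, via its integrated form $\mathbb E[f(X_n)\chi_B] = \mathbb E[h(X_m)\chi_B]$ for $\mathcal F_m$-measurable $B$, with $h(t)=\mathbb E[f(t+\sum_{j=m+1}^n Y_j)]$. The specifics below ($Y_j\sim N(0,1)$, $x_0=0$, $m=2$, $n=5$, $f=\chi_{[1,\infty)}$, $B=\{X_2>0\}$) are illustrative assumptions, not part of the answer:

```python
# Check E[f(X_5) 1_B] ~= E[h(X_2) 1_B] with h(t) = E[f(t + Y_3 + Y_4 + Y_5)].
# Illustrative choices: Y_j ~ N(0,1), x_0 = 0, f = 1{. >= 1}, B = {X_2 > 0}.
# Here Y_3 + Y_4 + Y_5 ~ N(0, 3), so h(t) = Phi((t - 1) / sqrt(3)).
from math import erf, sqrt
import numpy as np

Phi = np.vectorize(lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0))))  # N(0,1) CDF

rng = np.random.default_rng(2)
N = 200_000
Y = rng.standard_normal((N, 5))      # Y_1, ..., Y_5 for N sample paths
X2 = Y[:, :2].sum(axis=1)            # X_2 = Y_1 + Y_2  (x_0 = 0)
X5 = Y.sum(axis=1)                   # X_5

h = Phi((X2 - 1.0) / sqrt(3.0))      # h(X_2) = P(X_2 + N(0,3) >= 1)
B = X2 > 0                           # an F_2-measurable event

lhs = np.mean((X5 >= 1.0) & B)       # E[f(X_5) 1_B]
rhs = np.mean(h * B)                 # E[h(X_2) 1_B]
assert abs(lhs - rhs) < 0.01         # agree up to Monte Carlo error
```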

$\endgroup$
