All Questions
37
questions
0
votes
0
answers
15
views
Does it hold that $E[f(X_t)1_{\{s \geq T_1\}}\mid \mathcal F_{T_1}] = P_{t-T_1}f(X_{T_1})1_{\{s \geq T_1\}}$ for $s\leq t$, where $X$ is a strong Markov process?
Let $X=(X_t)_{t\geq 0}$ be a homogeneous cadlag Markov process taking values in a finite state space $S$. Let $T_1$ be its first jump time and $f$ be a bounded measurable function. I would like to ...
2
votes
1
answer
48
views
A cadlag Feller process for $\mathcal F$ is Markov w.r.t $\mathcal F_+$ (Th. 46, Chap. 1, Stochastic Integration - Protter)
In page 35 of the book Stochastic Integration by P. Protter, he defines a Feller process as follows:
Then he states the following theorem.
In the proof, he used the following strategy:
Next, he ...
1
vote
1
answer
47
views
Decomposing a general stopping time into stopping components
Let $(X_n)_{n \geq 0}$ be a discrete-time Markov chain taking values in a finite state space $S$, with transition matrix $P$. Let $(\mathcal F_n)_{n\geq 0}$ be the natural filtration and let $\tau \...
5
votes
1
answer
85
views
Sum of conditional expectations of a bounded stochastic process
Is there a proof for the following statement or is there a counter-example?
Let $\{X_t\}$ be a stochastic process
adapted to the filtration $\{\mathcal{F}_t\}$.
Assuming $0 \leq X_t \leq 1$,
and $\...
3
votes
0
answers
48
views
Is my proof of Markov Property for Reflected BM correct?
I want to show that $|B_{t}|$ is a Markov process, where $B_{t}$ is a standard Brownian motion. I have seen the proof here and here. But I don't understand why the method below might fail (or if it's ...
1
vote
1
answer
47
views
Interpretation of conditional probability of $X_0 = x$ for a Markov process and semigroup $(P_t)$ [closed]
When studying Markov processes, I have seen a lot of authors define the semigroup as $P_tf(x) = \mathbb E_x(f(X_t))$ (with the assumption that $X_t$ is homogeneous) and then call $\mathbb E_x$ the ...
1
vote
1
answer
88
views
How can we rigorously show conditional independence here?
Let
$(E,\mathcal E,\lambda)$ be a measure space;
$p:E\to[0,\infty)$ be $\mathcal E$-measurable with $$c:=\int p\:{\rm d}\lambda\in(0,\infty)$$ and $\mu$ denote the measure with density $\frac pc$ ...
0
votes
1
answer
51
views
Equivalence of definitions for the Markov property
I've seen definitions of the Markov property for a process $X$ indexed by the positive integers with values in $S$, that are supposedly equivalent. I consider the canonical filtration $\mathcal F=(\...
2
votes
0
answers
91
views
Conditional distribution of Gaussian process completely determined by conditional expectation
I am reading the book Stochastic Processes by J. L. Doob and trying to understand the argument that, for any (real-valued) Gaussian process $X = (X_t)_{t\ge0}$, the Markov property is characterized by ...
1
vote
0
answers
22
views
Markov property of $X_t + \sigma Y_t$ when $\sigma \rightarrow 0$
Let $(X_t)$ and $(Y_t)$ be sample continuous stochastic processes on $[0,1]$ such that $Z_t (\sigma):= X_t + \sigma Y_t$, where $\sigma >0$, is Markov with regard to the filtration generated by $...
2
votes
1
answer
238
views
Showing an equivalence between the martingale property and a Markov property.
I really am not sure how to get a rigorous answer to the following, any help would be greatly appreciated.
Let $(X_n)_{n\geq0}$ be an integrable process, taking values in a countable set $E ⊆ \mathbb{...
3
votes
2
answers
198
views
Markov transition kernels of process with independent increments
Suppose that $\{X_t : \Omega \to S := \mathbb{R}^d,\ t\in T\}$ is a stochastic
process with independent increments and let $\mathcal{B}_t :=\mathcal{B}_t^X$ (natural filtration) for all $t\in T$. Show, for ...
1
vote
0
answers
66
views
Is the Markov property under $\mathbb{P}$ preserved under a change to a measure $\mathbb{Q}$ absolutely continuous with respect to $\mathbb{P}$?
Let $(\Omega,\mathcal{F}, \{\mathcal{F}_n \}, \mathbb{P})$, be a filtered probability space, $\mathcal{F}= \sigma\{F_n,n\in \mathbb{N}\} $, $M_n$ a nonnegative martingale, and $\mathbb{E} \mathrm{M}_n=...
1
vote
1
answer
92
views
Relation between the strong Markov property of a process and the strong Markov property of the associated canonical process on the path space
Let
$(\Omega,\mathcal A,\operatorname P)$ be a probability space;
$(E,\mathcal E)$ be a measurable space;
$\pi_I$ denote the projection from $E^{[0,\:\infty)}$ onto $I\subseteq[0,\infty)$ and $\pi_t:=...
1
vote
0
answers
27
views
If $(Y_n)$ is iid, then $Z_n:=\sum_{i=1}^nY_i$ is Markov
Let $(\Omega,\mathcal A,\operatorname P)$ be a probability space, $(\mathcal F_n)_{n\in\mathbb N_0}$ be a filtration on $(\Omega,\mathcal A,\operatorname P)$, $E$ be a $\mathbb R$-Banach space, $(Y_n)...
1
vote
2
answers
380
views
Sum of i.i.d. random variables is a Markov process
Let $\{Y_j\}_1^\infty$ be i.i.d. real random variables on a common probability space.
$\forall n\in\mathbb{N}$, define $X_n = x_0 + \sum_{j=1}^nY_j$, where $x_0$ is a constant. Also define $X_0=x_0$. ...
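The excerpt fully states the construction $X_n = x_0 + \sum_{j=1}^n Y_j$ from i.i.d. steps. A minimal sketch of that construction (the $\pm1$ step sampler and parameters are illustrative assumptions, not part of the question):

```python
import random

def random_walk(x0, n, step_sampler, seed=0):
    """Construct X_0 = x0 and X_k = x0 + Y_1 + ... + Y_k from i.i.d. steps Y_j."""
    rng = random.Random(seed)
    xs = [x0]
    for _ in range(n):
        # X_{k+1} = X_k + Y_{k+1}: the next state depends on the past
        # only through the current state, which is the Markov property here.
        xs.append(xs[-1] + step_sampler(rng))
    return xs

# Example: simple symmetric walk with +/-1 steps starting at x0 = 0.
path = random_walk(0, 10, lambda rng: rng.choice([-1, 1]))
```

Since $X_{n+1} = X_n + Y_{n+1}$ with $Y_{n+1}$ independent of $(X_0,\dots,X_n)$, the conditional law of $X_{n+1}$ given the whole history is a function of $X_n$ alone, which is the content of the question.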
9
votes
2
answers
426
views
If $Y\sim\mu$ with probability $p$ and $Y\sim\kappa(X,\;\cdot\;)$ otherwise, what's the conditional distribution of $Y$ given $X$?
Let
$(\Omega,\mathcal A,\operatorname P)$ be a probability space
$(E,\mathcal E)$ be a measurable space
$\mu$ be a probability measure on $(E,\mathcal E)$
$X$ be an $(E,\mathcal E)$-valued random ...
0
votes
0
answers
90
views
If $X$ is a Feller process, then $\sup_{x\in E}\text E\left[d(X_s,X_t)\wedge1\mid X_0=x\right]\xrightarrow{s-t\to0}0$
Let $(E,d)$ be a compact metric space, $(T(t))_{t\ge0}$ be a strongly continuous contraction semigroup on $C(E)$, $(\Omega,\mathcal A,\operatorname P)$ be a probability space, $(X_t)_{t\ge0}$ be an $E$...
0
votes
1
answer
222
views
Definition of Factorization of Conditional Expectation
I believe this is a very silly question or I am overlooking something fairly simple but I cannot make sense of the factorization of the conditional expectation in a very concrete application:
I am ...
1
vote
1
answer
940
views
Equivalent Definitions of the Markov Property
Assume we have a stochastic process $\{X_n\}_\mathbb{N}$ defined on some underlying probability space that takes values in another measurable space. One of the many definitions that I have seen of ...
1
vote
1
answer
193
views
If $(κ_t)_{t≥0}$ is the transition semigroup of a continuous Markov process, is $t↦(κ_tf)(x)$ continuous for all bounded continuous $f$ and fixed $x$?
Let
$(\Omega,\mathcal A,\operatorname P)$ be a probability space
$(\mathcal F_t)_{t\ge0}$ be a filtration on $(\Omega,\mathcal A)$
$E$ be a metric space
$(X_t)_{t\ge0}$ be an $E$-valued right-...
0
votes
1
answer
104
views
Equivalence of the discrete definition of the Markov property in the continuous case
In the book, Lectures from Markov Processes to Brownian Motion, it is stated that the oldest definition of Markov property is, for every integer $n\ge1$ and $0\le t_1<t_2<\cdots<t<u,$ and $...
1
vote
0
answers
47
views
Markov property for unbounded function
Let $(X_t)$ be a Markov process with respect to a filtration $\mathcal{F}_t$. Assume that $P(X_t>0 \, \forall t\geq 0) = 1 $.
Denote $E_x$ the expectation under the measure where $X_0=x$.
Is it ...
1
vote
1
answer
79
views
Finite-dimensional conditional distributions of a Markov process
Let
$(\Omega,\mathcal A,\operatorname P)$ be a probability space
$I\subseteq\mathbb R$
$(\mathcal F_t)_{t\in I}$ be a filtration on $(\Omega,\mathcal A)$
$(E,\mathcal E)$ be a measurable space
$X$ be ...
1
vote
1
answer
255
views
Show some property of a Markov process
Let
$(\Omega,\mathcal A,\operatorname P)$ be a probability space
$I\subseteq\mathbb R$
$(\mathcal F_t)_{t\in I}$ be a filtration on $(\Omega,\mathcal A)$
$(E,\mathcal E)$ be a measurable space
$X$ be ...
1
vote
2
answers
122
views
Markov Property and FDDs
Let $X,Y$ be two discrete time $\mathbb{R}^n$-valued stochastic processes with the same finite dimensional distributions. It may be that $X,Y$ are defined on two different probability spaces. Now, if $...
0
votes
1
answer
77
views
Show that $(e^{\alpha X_t} \int_0^t e^{-\alpha X_u}\,du,\ t \geq 0)$ is a Markov process
I want to show that $(e^{\alpha X_t} \int_0^t e^{-\alpha X_u}\,du,\ t \geq 0)$ is a Markov process whereas $(\int_0^t e^{-\alpha X_u}\,du,\ t\geq 0)$ is not. Here $X_t$ is a Lévy process and ...
1
vote
1
answer
497
views
ARCH(1) process is a Markov process
I have a question about the ARCH(1) process. Let $(\Omega, \mathcal F, P)$ be a probability space, let $(Z_t)_{t \in \mathbb Z}$ be a sequence of i.i.d. real-valued random variables with mean zero and ...
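The excerpt cuts off before the recursion. Assuming the standard ARCH(1) form $X_t = Z_t\sqrt{\alpha_0+\alpha_1 X_{t-1}^2}$ with i.i.d. innovations $Z_t$ (the parameter values below are arbitrary), a minimal simulation sketch:

```python
import math
import random

def arch1(alpha0, alpha1, n, seed=0):
    """Simulate the standard ARCH(1) recursion
    X_t = Z_t * sqrt(alpha0 + alpha1 * X_{t-1}^2)
    with i.i.d. standard normal innovations Z_t, started at X_0 = 0."""
    rng = random.Random(seed)
    x = 0.0
    path = [x]
    for _ in range(n):
        z = rng.gauss(0.0, 1.0)
        # X_t is a function of (X_{t-1}, Z_t) only, with Z_t independent
        # of the past -- which is why the process is Markov.
        x = z * math.sqrt(alpha0 + alpha1 * x * x)
        path.append(x)
    return path

path = arch1(0.5, 0.3, 100)
```

The Markov property follows exactly as in the update comment: each $X_t$ is a measurable function of the current state and a fresh independent innovation.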
0
votes
1
answer
477
views
Conditional expectation of a Brownian motion
Let $B$ be a Brownian motion. Fix times $0 < r < t$.
Write $\mathcal{D}$ for the space of paths traced out by continuous maps from $[0,t]$ to $\mathbb{R}$ with a suitable (e.g. Skorokhod) ...
1
vote
0
answers
395
views
Markov property - equivalent notions
Why are these different notions of the Markov property equivalent:
$$\forall A\in\mathcal{S}\qquad \mathbb{P}(X_t\in A\mid\mathcal{F}_s)=\mathbb{P}(X_t\in A\mid X_s)$$
$$\forall f:S\to\mathbb{R} \text{ ...