All Questions

0 votes
0 answers
15 views

Does it hold that $E[f(X_t)1_{\{s \geq T_1\}}\mid \mathcal F_{T_1}] = P_{t-T_1}f(X_{T_1})1_{\{s \geq T_1\}}$ for $s\leq t$ when $X$ is a strong Markov process?

Let $X=(X_t)_{t\geq 0}$ be a homogeneous cadlag Markov process taking values in a finite state space $S$. Let $T_1$ be its first jump time and $f$ be a bounded measurable function. I would like to ...
mathnoob
2 votes
1 answer
48 views

A cadlag Feller process for $\mathcal F$ is Markov w.r.t. $\mathcal F_+$ (Th. 46, Chap. 1, Stochastic Integration - Protter)

On page 35 of the book Stochastic Integration by P. Protter, he defines a Feller process as follows: Then he states the following theorem. In the proof, he uses the following strategy: Next, he ...
Jeffrey Jao
1 vote
1 answer
47 views

Decomposing a general stopping time into stopping components

Let $(X_n)_{n \geq 0}$ be a discrete-time Markov chain taking values in a finite state space $S$, with transition matrix $P$. Let $(\mathcal F_n)_{n\geq 0}$ be the natural filtration and let $\tau \...
Jeffrey Jao
5 votes
1 answer
85 views

Sum of conditional expectations of a bounded stochastic process

Is there a proof for the following statement or is there a counter-example? Let $\{X_t\}$ be a stochastic process adapted to the filtration $\{\mathcal{F}_t\}$. Assuming $0 \leq X_t \leq 1$, and $\...
Alireza Bakhtiari
3 votes
0 answers
48 views

Is my proof of Markov Property for Reflected BM correct?

I want to show that $|B_{t}|$ is a Markov process, where $B_{t}$ is a standard Brownian motion. I have seen the proof here and here, but I don't understand why the method below might fail (or if it's ...
Dovahkiin • 1,285
1 vote
1 answer
47 views

Interpretation of the conditional probability of $X_0 = x$ for a Markov process and semigroup $(P_t)$ [closed]

When studying Markov processes, I have seen a lot of authors define the semigroup as $P_tf(x) = \mathbb E_x(f(X_t))$ (with the assumption that $X_t$ is homogeneous) and then call $\mathbb E_x$ the ...
Jeffrey Jao
1 vote
1 answer
88 views

How can we rigorously show conditional independence here?

Let $(E,\mathcal E,\lambda)$ be a measure space; $p:E\to[0,\infty)$ be $\mathcal E$-measurable with $$c:=\int p\:{\rm d}\lambda\in(0,\infty)$$ and $\mu$ denote the measure with density $\frac pc$ ...
0xbadf00d • 13.9k
0 votes
1 answer
51 views

Equivalence of definitions for the Markov property

I've seen definitions of the Markov property for a process $X$ indexed by the positive integers with values in $S$ that are supposedly equivalent. I consider the canonical filtration $\mathcal F=(\...
Peter Strouvelle
2 votes
0 answers
91 views

Conditional distribution of Gaussian process completely determined by conditional expectation

I am reading the book Stochastic Processes by J. L. Doob and trying to understand the argument that, for any (real-valued) Gaussian process $X = (X_t)_{t\ge0}$, the Markov property is characterized by ...
user486506
1 vote
0 answers
22 views

Markov property of $X_t + \sigma Y_t$ when $\sigma \rightarrow 0$

Let $(X_t)$ and $(Y_t)$ be sample continuous stochastic processes on $[0,1]$ such that $Z_t (\sigma):= X_t + \sigma Y_t$, where $\sigma >0$, is Markov with regard to the filtration generated by $...
W. Volante • 2,294
2 votes
1 answer
238 views

Showing an equivalence between the martingale property and a Markov property.

I really am not sure how to get a rigorous answer to the following; any help would be greatly appreciated. Let $(X_n)_{n\geq0}$ be an integrable process, taking values in a countable set $E \subseteq \mathbb{...
verygoodbloke
3 votes
2 answers
198 views

Markov transition kernels of process with independent increments

Suppose that $\{X_t : \Omega \to S := \mathbb{R}^d,\ t\in T\}$ is a stochastic process with independent increments and let $\mathcal{B}_t :=\mathcal{B}_t^X$ (natural filtration) for all $t\in T$. Show, for ...
edamondo • 1,397
1 vote
0 answers
66 views

Is the Markov property under $\mathbb{P}$ preserved under a change to a measure $\mathbb{Q}$ absolutely continuous with respect to $\mathbb{P}$?

Let $(\Omega,\mathcal{F}, \{\mathcal{F}_n\}, \mathbb{P})$ be a filtered probability space, $\mathcal{F}= \sigma\{\mathcal F_n,\,n\in \mathbb{N}\}$, $M_n$ a nonnegative martingale, and $\mathbb{E} M_n=...
yxyt • 73
1 vote
1 answer
92 views

Relation between the strong Markov property of a process and the strong Markov property of the associated canonical process on the path space

Let $(\Omega,\mathcal A,\operatorname P)$ be a probability space; $(E,\mathcal E)$ be a measurable space; $\pi_I$ denote the projection from $E^{[0,\:\infty)}$ onto $I\subseteq[0,\infty)$ and $\pi_t:=...
0xbadf00d • 13.9k
1 vote
0 answers
27 views

If $(Y_n)$ is iid, then $Z_n:=\sum_{i=1}^nY_i$ is Markov

Let $(\Omega,\mathcal A,\operatorname P)$ be a probability space, $(\mathcal F_n)_{n\in\mathbb N_0}$ be a filtration on $(\Omega,\mathcal A,\operatorname P)$, $E$ be a $\mathbb R$-Banach space, $(Y_n)...
0xbadf00d • 13.9k
1 vote
2 answers
380 views

Sum of i.i.d. random variables is a Markov process

Let $\{Y_j\}_1^\infty$ be i.i.d. real random variables on a common probability space. $\forall n\in\mathbb{N}$, define $X_n = x_0 + \sum_{j=1}^nY_j$, where $x_0$ is a constant. Also define $X_0=x_0$. ...
RunningMeatball
9 votes
2 answers
426 views

If $Y\sim\mu$ with probability $p$ and $Y\sim\kappa(X,\;\cdot\;)$ otherwise, what's the conditional distribution of $Y$ given $X$?

Let $(\Omega,\mathcal A,\operatorname P)$ be a probability space, $(E,\mathcal E)$ be a measurable space, $\mu$ be a probability measure on $(E,\mathcal E)$, and $X$ be an $(E,\mathcal E)$-valued random ...
0xbadf00d • 13.9k
0 votes
0 answers
90 views

If $X$ is a Feller process, then $\sup_{x\in E}\text E\left[d(X_s,X_t)\wedge1\mid X_0=x\right]\xrightarrow{s-t\to0}0$

Let $(E,d)$ be a compact metric space, $(T(t))_{t\ge0}$ be a strongly continuous contraction semigroup on $C(E)$, $(\Omega,\mathcal A,\operatorname P)$ be a probability space, $(X_t)_{t\ge0}$ be an $E$...
0xbadf00d • 13.9k
0 votes
1 answer
222 views

Definition of Factorization of Conditional Expectation

I believe this is a very silly question or I am overlooking something fairly simple but I cannot make sense of the factorization of the conditional expectation in a very concrete application: I am ...
Fabian Werner
1 vote
1 answer
940 views

Equivalent Definitions of the Markov Property

Assume we have a stochastic process $\{X_n\}_{n\in\mathbb{N}}$ defined on some underlying probability space, taking values in another measurable space. One of the many definitions that I have seen of ...
user56628 • 313
1 vote
1 answer
193 views

If $(κ_t)_{t≥0}$ is the transition semigroup of a continuous Markov process, is $t↦(κ_tf)(x)$ continuous for all bounded continuous $f$ and fixed $x$?

Let $(\Omega,\mathcal A,\operatorname P)$ be a probability space, $(\mathcal F_t)_{t\ge0}$ be a filtration on $(\Omega,\mathcal A)$, $E$ be a metric space, and $(X_t)_{t\ge0}$ be an $E$-valued right-...
0xbadf00d • 13.9k
0 votes
1 answer
104 views

Equivalence of the discrete definition of the Markov property in the continuous case

In the book Lectures from Markov Processes to Brownian Motion, it is stated that the oldest definition of the Markov property is: for every integer $n\ge1$ and $0\le t_1<t_2<\cdots<t<u$, and $...
Wei • 183
1 vote
0 answers
47 views

Markov property for unbounded function

Let $(X_t)$ be a Markov process with respect to a filtration $\mathcal{F}_t$. Assume that $P(X_t>0 \ \forall t\geq 0) = 1$. Denote by $E_x$ the expectation under the measure where $X_0=x$. Is it ...
htd • 1,774
1 vote
1 answer
79 views

Finite-dimensional conditional distributions of a Markov process

Let $(\Omega,\mathcal A,\operatorname P)$ be a probability space, $I\subseteq\mathbb R$, $(\mathcal F_t)_{t\in I}$ be a filtration on $(\Omega,\mathcal A)$, $(E,\mathcal E)$ be a measurable space, and $X$ be ...
0xbadf00d • 13.9k
1 vote
1 answer
255 views

Show some property of a Markov process

Let $(\Omega,\mathcal A,\operatorname P)$ be a probability space, $I\subseteq\mathbb R$, $(\mathcal F_t)_{t\in I}$ be a filtration on $(\Omega,\mathcal A)$, $(E,\mathcal E)$ be a measurable space, and $X$ be ...
0xbadf00d • 13.9k
1 vote
2 answers
122 views

Markov Property and FDDs

Let $X,Y$ be two discrete time $\mathbb{R}^n$-valued stochastic processes with the same finite dimensional distributions. It may be that $X,Y$ are defined on two different probability spaces. Now, if $...
jpv • 2,031
0 votes
1 answer
77 views

Show that $\left(e^{\alpha X_t} \int_0^t e^{-\alpha X_u}\,du,\ t \geq 0\right)$ is a Markov process

I want to show that $\left(e^{\alpha X_t} \int_0^t e^{-\alpha X_u}\,du,\ t \geq 0\right)$ is a Markov process, whereas $\left(\int_0^t e^{-\alpha X_u}\,du,\ t\geq 0\right)$ is not. Here $X_t$ is a Lévy process and ...
na1201 • 630
1 vote
1 answer
497 views

ARCH(1) process is a Markov process

I have a question about the ARCH(1) process. Let $(\Omega, \mathcal F, P)$ be a probability space, let $(Z_t)_{t \in \mathbb Z}$ be a sequence of i.i.d. real-valued random variables with mean zero and ...
numerion • 683
0 votes
1 answer
477 views

Conditional expectation of a Brownian motion

Let $B$ be a Brownian motion. Fix times $0 < r < t$. Write $\mathcal{D}$ for the space of paths traced out by continuous maps from $[0,t]$ to $\mathbb{R}$ with a suitable (e.g. Skorokhod) ...
Frank • 3,884
1 vote
0 answers
395 views

Markov property - equivalent notions

Why are these different notions of the Markov property equivalent: $$\forall A\in\mathcal{S}\qquad \mathbb{P}(X_t\in A|\mathcal{F}_s)=\mathbb{P}(X_t\in A|X_s)$$ $$\forall f:S\to\mathbb{R} \text{ ...
julbern • 402
1 vote
0 answers
33 views

Equation with the expectation of an assessed Markov process

In my book about Markov processes there is the following equation in a proof, and I don't see why it holds. I have already asked some people at the university without success; can somebody help me? $$E(\...
Nullmenge
10 votes
1 answer
234 views

Exploiting the Markov property

I've encountered the following problem when dealing with short-rate models in finance and applying the Feynman-Kac theorem to relate conditional expectations to PDEs. Let $(\Omega,\mathcal{F},\{\...
JohnSmith • 1,524
6 votes
1 answer
1k views

How to Prove that a (Centered) Gaussian Process is Markov if and only if this Equation Holds?

A centered Gaussian process is Markov if and only if its covariance function $\Gamma: \mathbb{R}\times\mathbb{R} \to \mathbb{R}$ satisfies the equality: $$\Gamma(s,u)\Gamma(t,t)=\Gamma(s,t)\...
Chill2Macht • 21.3k
1 vote
1 answer
459 views

How to use the Markov property of Brownian motion

This is a problem from Durrett's Probability: Theory and Examples, Exercise 8.2.1. It is not homework. The exercise states: Let $T_0 = \inf\{s > 0 : B_s = 0\}$ and let $R = \inf\{t > 1 : B_t = 0\}$. ...
Brownianmotionhurtsmyhead
3 votes
1 answer
640 views

Stationary Markov process properties

Let $X$ be a right-continuous process with values in $(E,\mathcal{E})$, defined on $(\Omega, \mathcal{F}_t,P)$. Suppose that $X$ has stationary, independent increments. I now want to show the ...
user
1 vote
1 answer
49 views

Markov Processes: $P_x$ and $E_x$

In the study of Markov processes, one usually introduces the measures $P_{\pi}$ on the path space of the process, where $\pi$ is an initial distribution of the process $X$, i.e. $\pi=\mathcal L(X_0)$. ...
user
2 votes
1 answer
380 views

Markov property for a stochastic process with discrete state space.

Consider a stochastic process $\{X_s\}_{s\in\mathcal S\subseteq\mathbb R}$ with values in $(\mathbb R,\mathcal B(\mathbb R))$ adapted to a filtration $\{\mathcal F_s\}$ (we can suppose that $\{\...
Dubious • 13.5k