All Questions
Tagged with: statistics, stochastic-processes
712 questions
1 vote · 0 answers · 69 views
Stochastic Simulation - Simulation from the Marginal Distributions
I am reviewing some material on MCMC / simulation and I realised I never quite understood this point. Given a joint distribution $f(x_1, ..., x_n) = f(x_1) f(x_2 | x_1)...f(x_n | x_{n-1}, ..., x_1)$ ...
4 votes · 1 answer · 442 views
Popular mistakes in probability
Question: What non-trivial mistakes do students often make when solving problems in probability theory, mathematical statistics and random processes?
Some examples of wrong solutions:
Problem 1: Find ...
1 vote · 0 answers · 43 views
Optimal matching of Bernoulli random variables
Let $Z_1, \dots, Z_n$ be a sequence of independent Bernoulli random variables such that
for all $i\in\{1,\dots,n\}$, $Z_i\sim\mathcal{B}(p_i)$, where $p_i < 1/2$.
Define $l(x_{1:n}, y_{1:n}) =...
0 votes · 1 answer · 101 views
Conditional expectation with random variables from different Probability spaces [closed]
Let $(X,\mathcal{F}_X,\mathbb{P})$ and $(Y,\mathcal{F}_Y,\mathbb{Q})$ be two probability spaces. I know that the expectation of random variable $Z:X\rightarrow \mathbb{R}$ is affected by the random ...
1 vote · 0 answers · 44 views
Renewal reward process's reward tail probability
Suppose we are given a die with $K$ faces, denoted by $k=1,\dots,K$, where the probability of realizing face $k$ is $p_k\in[0,1]$ with $\sum_{k=1,\dots,K}p_k=1$.
Now, we roll the die repeatedly. ...
1 vote · 1 answer · 106 views
Probability that a probability will be less than a certain value
Suppose I have a nonnegative random variable $X$ and I don't know its expected value, but I know that its expected value is less than or equal to $a$ with probability at least $p^*$, i.e. $\mathbb{P}(\...
0 votes · 1 answer · 34 views
Markov Chain Detailed Balance $\pi(x)P(x, y) = \pi(y)P(y, x)$
Let's say I have a Markov chain and it has a transition matrix denoted as $P$. The $(row, column)$ elements of the $P$ matrix are denoted as $P(i, j)$. Just by looking at the transition matrix $P$, ...
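The detailed-balance condition can be checked numerically straight from a transition matrix: compute the stationary distribution $\pi$ and test whether the "probability flow" matrix $\pi(x)P(x,y)$ is symmetric. A minimal numpy sketch (the two example chains are illustrative, not from the question):

```python
import numpy as np

def stationary_dist(P):
    # Left eigenvector of P for eigenvalue 1, normalized to a probability vector.
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
    return pi / pi.sum()

def satisfies_detailed_balance(P, tol=1e-10):
    # Detailed balance: pi(x) * P(x, y) == pi(y) * P(y, x) for all x, y,
    # i.e. the flow matrix pi[:, None] * P is symmetric.
    pi = stationary_dist(P)
    flow = pi[:, None] * P
    return bool(np.allclose(flow, flow.T, atol=tol))

# A birth-death chain is always reversible...
birth_death = np.array([[0.5, 0.5, 0.0],
                        [0.25, 0.5, 0.25],
                        [0.0, 0.5, 0.5]])
# ...while a deterministic 3-cycle has uniform pi but asymmetric flow.
cycle = np.array([[0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0],
                  [1.0, 0.0, 0.0]])
```

For the cycle, $\pi$ is uniform yet $\pi(0)P(0,1)=1/3 \neq \pi(1)P(1,0)=0$, so stationarity alone does not imply reversibility.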
0 votes · 2 answers · 46 views
Probability that 2 earthquakes happen in a period of time
The number of earthquakes that happen at island X follows a Poisson process with mean 2. Given that 2 earthquakes have happened in this year, find the probability that both earthquakes happen ...
1 vote · 1 answer · 13 views
Expectation of the process adapted to the filtration of the Wiener process
Suppose $\sigma_t$ is a stochastic process adapted to the filtration $\mathcal{F}_t$ generated by the Wiener process $W_t$.
I would like to know how to compute the following expectation:
$$E = \mathbb{...
2 votes · 0 answers · 35 views
Difference between the compensator of a point process under the true parameter and under its MLE estimate
Suppose we have some point process $N=N_{\theta_0}$ on the real line, driven by a conditional intensity $\lambda_{\theta_0}$ dependent on some finite-dimensional parameter $\theta_0\in\Theta\subset\...
2 votes · 1 answer · 56 views
Convergence of weighted sum to Brownian Motion
Let $\{\varepsilon_t\}_{t = 1}^T$ be a sequence of iid random variables such that $\varepsilon_t \sim N(0, \sigma^2)$ and $\sigma^2 > 0$. Then it is known that (see 17.3.6 in James Hamilton's Time ...
3 votes · 1 answer · 965 views
Two independent Poisson processes.
I am trying to prove that the number of occurrences of a Poisson process before the first occurrence of another, independent Poisson process is a geometric random variable.
\begin{align}
&...
0 votes · 1 answer · 2k views
What is the probability of getting the same side n times in a row in a coin toss
Assuming everything is fair, what are the odds that one of the two sides in a coin toss wins 6 times in a row within the first 6 tosses?
Please also answer for the general case ...
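For small $N$ the answer can be computed exactly by enumerating all $2^N$ toss sequences and counting those containing a run of length $n$. A quick sketch (the function name `p_run` is mine):

```python
from itertools import product

def p_run(n, N):
    """Exact probability that a fair coin shows the same side at least
    n times in a row somewhere within the first N tosses."""
    hits = sum(1 for seq in product('HT', repeat=N)
               if 'H' * n in ''.join(seq) or 'T' * n in ''.join(seq))
    return hits / 2 ** N

# 6 in a row within 6 tosses: only HHHHHH and TTTTTT qualify, so 2/64.
print(p_run(6, 6))  # 0.03125
```

For larger $N$ one would switch to a dynamic-programming recursion over the current run length, since enumeration grows as $2^N$.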
1 vote · 0 answers · 97 views
Unbiased Cumulant Estimate - Fifth Cumulant
I am searching for the definition of the $5^{th}$ unbiased cumulant estimate.
Let $K_j$, be the $j$-th unbiased cumulant estimate of a probability distribution, based on the sample moments.
Let $m_j$ be ...
0 votes · 1 answer · 39 views
Sub-Gaussian $X_t$, prove $\mathbb{E}\left[\sup_{t\in T}X_t \right] \leq 2 \mathbb{E}\left[\sup_{\rho(t,s)\leq \delta}(X_t-X_s) \right]+J(\delta,T)$
This is a question-and-answer just for me, but if you have alternate answers or comments, feel free to share them.
Let $(T,\rho)$ be a metric space and $\{X_t\}_{t\in T}$ be a sub-Gaussian process ...
0 votes · 1 answer · 49 views
A Gaussian process and a Rademacher process are sub-Gaussian
This is a question-and-answer just for me, but if you have alternate answers or comments, feel free to share them.
Let $(T,\rho)$ be a metric space and $\{X_t\}_{t\in T}$ be a stochastic process ...
4 votes · 0 answers · 54 views
When $X_t$ is conditionally normal distributed and has density $p_t$, how can we compute $\text E\left[\left\|\nabla\ln p_t(X_t)\right\|^2\right]$?
Let $d\in\mathbb N$ and $(X_t)_{t\ge0}$ be an $\mathbb R^d$-valued process. Assume $$\operatorname P\left[X_t\in\;\cdot\;\mid X_0\right]=\mathcal N(X_0,\Sigma_t)\tag1$$ for some covariance matrix $\...
5 votes · 2 answers · 130 views
Does an ergodic series converge to the expectation?
Let $(X_i, Y_i)_{i\in\mathbb{N}}$ be a real-valued stochastic process. We say that $X$ is mean-ergodic, if $$\frac{1}{n}\sum_{i=1}^nX_i\to \mathbb{E}X_1$$ in probability as $n\to\infty$.
Let $S_n:=\{i\...
2 votes · 1 answer · 104 views
Asymptotic Gambler's Ruin Probability with Unequal Gain/Loss with Zero-Mean Payoff Distribution
The gambler's ruin problem with unequal gain/loss with a payoff distribution whose support is a finite subset of $\mathbb Z$ is an old problem; for example, see Feller (1968, Vol.1, Section XIV.8) and ...
1 vote · 1 answer · 117 views
How to deduce an expression for a specific conditional expectation
The problem occurs when reading Bombardini et al., 2023, "Did US Politicians Expect the China Shock?", American Economic Review, Vol. 1, pp. 174-209.
The authors define $\xi_{it}$ to be a ...
2 votes · 1 answer · 119 views
Covariance Operator corresponding to multivariate covariance function
The usual definition of a covariance operator on $L_2(D)$ is:
$$
C : L_2(D) \to L_2(D), \qquad (C \psi)(x) = \int_D c(x,y) \psi(y) dy \qquad \forall x\in D, ~~\psi \in L_2(D),
$$
where $c(x,y): D \...
1 vote · 0 answers · 14 views
M/M/1 Queues: Exclusive Queue Length is not Markov
For an M/M/1 queue, let $N_q(t) = (Q(t)-1)^{+}$ be the number of customers in the queue excluding the one being served. We have to show that $N_q(t)$ is not a continuous-time Markov chain. [src: Sidney ...
0 votes · 1 answer · 28 views
Variance of time integral of a function on an Ito process
I was struggling a bit with the time integral of an Ito process.
Say I have this:
$$\int^T_t \alpha\circ X_t ds$$
Where $X_t$ is an Ito process, and $\alpha$ is a continuous function. What can we say ...
0 votes · 1 answer · 56 views
Can the addition of noise to a dynamical system reduce estimation errors?
I am using a Kalman filter to estimate the states of a stochastic dynamical system which has very small noise (effectively zero). The filter is not aware that the noise is zero. Implementation of KF ...
0 votes · 0 answers · 64 views
How to find joint pdf
The random arrival of $k$ phone calls within a time interval of length $t$ is described by the following pdf
$$f(t) = \frac{\lambda^k}{(k-1)!} t^{k-1} e^{-\lambda t}$$
where parameter $\lambda$ ...
0 votes · 1 answer · 43 views
Question on determining the posterior pdf
Can someone tell me how the pdf of noise (w) is equivalent to the conditional pdf of observations (x) given A, assuming noise is independent of A for the equation x[n]=A+w[n] where A is the mean (and ...
0 votes · 0 answers · 28 views
Genomic and sum of geometric random variables
In their paper "The Maximum of Independent Geometric Random Variables as the Time for Genomic Evolution", the authors noted that if we consider a genomic word of $L$ letters, then the measure of the time ...
1 vote · 0 answers · 47 views
Mean value of the square-root Cox-Ingersoll-Ross process
Consider the so-called Cox-Ingersoll-Ross model
\begin{equation}
dr_t=a(b-r_t)dt+\sigma \sqrt{r_t}dW_t
\end{equation}
It can be shown (Wikipedia) that the mean of this process is
\begin{...
2 votes · 1 answer · 99 views
Variance recursion formula in Galton-Watson process
Consider a Galton-Watson process with expected offspring $\mathbb{E}[\xi]=\mu<\infty$ and variance $\text{Var}(\xi)=\sigma^2<\infty$ where the offspring in generation $t\in\mathbb{N}$ is given ...
3 votes · 1 answer · 94 views
Existence of Malthusian parameter
Consider a continuous time point process $\eta(t)$ representing the number of points in the interval $[0, t]$. Let $\eta(\infty)$ be distributed as the total number of children of a particle. Define $\...
0 votes · 0 answers · 29 views
What are necessary and sufficient conditions for the Bellman Equation to be solvable?
I am studying Markov Reward Processes right now, and I wish to gain a deeper understanding of the Bellman equation's relationship with them.
I learned the Bellman equation in the following form:
$v = ...
6 votes · 1 answer · 94 views
Show that $X_{t}:=\alpha X_{t-1}+\epsilon_{t}$ is strictly stationary for $|\alpha|<1$ and $\epsilon_{t}$ i.i.d. $\sim N(0,\sigma^{2})$.
The title can be shortened to "prove that $AR(1)$ processes are strictly stationary when $|\alpha|<1$". This has been discussed many times on MSE and Cross Validated, but I found no ...
2 votes · 2 answers · 299 views
Rough path expected signature vs cumulant-generating function / characteristic function
What is the point of using the rough path expected signature to characterize the law of a stochastic process when the cumulant generating function is known ($\log\mathbb{E}[e^{i\theta X(t)}]$)?
Since an ...
0 votes · 0 answers · 31 views
Approximation of a new kernel by a linear combination of previous kernels
In the reference by Knutsen, page 25, the kernel linear independence test is explained:
Knutsen, Sverre. "Gaussian processes for online system identification and control of a quadrotor." (2019)...
9 votes · 1 answer · 278 views
Expected difference between the largest and second largest observations in a sample of i.i.d. normal variables
Let $X_1,\dots,X_n$ be an i.i.d. sample from the standard normal distribution. Let
\begin{align}
\mu_n = \mathbb{E}[X_{(n)} - X_{(n-1)}],
\end{align}
be the expected difference between the largest ...
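$\mu_n$ is easy to estimate by Monte Carlo as a sanity check on any closed-form answer; for $n=2$ the known value is $\mu_2 = \mathbb{E}|X_1 - X_2| = 2/\sqrt{\pi} \approx 1.128$. A sketch (function name mine):

```python
import numpy as np

def expected_top_gap(n, trials=200_000, seed=0):
    # Monte Carlo estimate of E[X_(n) - X_(n-1)]: the mean gap between
    # the largest and second-largest of n iid standard normals.
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((trials, n))
    x.sort(axis=1)
    return (x[:, -1] - x[:, -2]).mean()
```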
0 votes · 0 answers · 43 views
Expected value of Dirac measure
Let $(x_n)$ be a sequence in $\mathbb R^d$. When does $\mu_n = \delta_{x_n}$ converge weakly?
This is my attempt to answer this. Weak convergence means:
$$\int f \, d \mu_n \rightarrow \int f \, d \mu$$ ...
0 votes · 2 answers · 68 views
How to take the variance of a second order expansion? $\text{Var}\left[aX+bY+cXY+mX^2+nY^2\right]$
Let's say we have 5 real-valued constant parameters $\{a,\ b,\ c,\ m,\ n\}$, and two random variables $...
2 votes · 1 answer · 249 views
How far are the Mode and the Median of the Log-Normal distribution from behaving as Linear functions?
Intro_______________
Recently I made a question where later I figure out I was requiring that the ...
3 votes · 1 answer · 98 views
Magical relationship between Exponential distribution and Poisson process
Consider i.i.d. random variables $X_1,X_2,\ldots,X_n$ satisfying exponential distribution $\operatorname{Exp}(1)$. Let $Y=X_1+X_2+\ldots+X_n$. We know that the p.d.f. of $Y$ is the Gamma distribution
$...
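The claim that a sum of $n$ i.i.d. $\operatorname{Exp}(1)$ variables follows $\Gamma(n,1)$ can be checked empirically through its first two moments, both of which equal $n$. A small sketch (function name mine):

```python
import numpy as np

def sum_of_exponentials_moments(n, trials=100_000, seed=1):
    # Y = X_1 + ... + X_n with X_i ~ Exp(1) should be Gamma(n, 1),
    # which has mean n and variance n.
    rng = np.random.default_rng(seed)
    y = rng.exponential(1.0, size=(trials, n)).sum(axis=1)
    return y.mean(), y.var()
```

The same samples could be compared against the Gamma density itself (e.g. via a histogram or a Kolmogorov-Smirnov test) for a stronger check.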
0 votes · 0 answers · 122 views
Let $Z_1, Z_2, \cdots, Z_n$ be independent random variables with mean $0$ and variance $\sigma^2 < \infty$, and let $X_n=Z_1+Z_2+\cdots+Z_n$.
I'm trying to prove that $\mathbb{E}(X_nX_m)=\min(n,m)\...
0 votes · 0 answers · 66 views
Sample Space of such a random variable
Suppose there is a random experiment in which a person is asked to flip a coin $3$ times. The coin has $2$ sides (numbers and pictures). In the sample space of the random experiment, $3$ random ...
1 vote · 0 answers · 30 views
Moment Generating Function of $\bar{M}-\bar{N}$
What is the mean value of $\bar{M}-\bar{N}$, the moment generating function of $\bar{M}-\bar{N}$, and the variance of $\bar{M}-\bar{N}$? Given $M_1,M_2,\dots,M_n$ is a random sample of size $p$ from the Gamma ...
4 votes · 1 answer · 232 views
Estimating the value of $\sigma$ for Brownian motion
Let $X_t=\sigma W_t$ be a stochastic process, where $W_t$ is the Wiener process and $\sigma$ is an unknown parameter.
I want a formula to estimate the value of $\sigma$ (which could not be found in ...
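One standard estimator for this setting uses the realized quadratic variation of a discretely observed path: for $X_t = \sigma W_t$ sampled on an equally spaced grid of $[0, T]$, the sum of squared increments converges to $\sigma^2 T$. A sketch (not necessarily the formula an answer there proposes):

```python
import numpy as np

def estimate_sigma(path, T):
    """Estimate sigma in X_t = sigma * W_t from observations at equally
    spaced times on [0, T]: sum of squared increments ~ sigma^2 * T."""
    return float(np.sqrt(np.sum(np.diff(path) ** 2) / T))

# Simulate a path with a known sigma and recover it.
rng = np.random.default_rng(2)
sigma, T, n = 1.7, 1.0, 100_000
increments = sigma * rng.standard_normal(n) * np.sqrt(T / n)
path = np.concatenate([[0.0], np.cumsum(increments)])
```

The estimator's relative error shrinks like $\sqrt{2/n}$ in the number of grid points, so a fine grid matters more than a long horizon.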
0 votes · 0 answers · 22 views
Consider the stochastic process $\{X_t\}$ such that $\mathbb{P}(X_0=1)=\mathbb{P}(X_0=-1)=1/2$. $X_t$ changes sign at Poisson times.
That is, the probability of $k$ changes of sign in a time interval ...
2 votes · 1 answer · 53 views
Suppose $g$ is a periodic function with period $k$. Let the stochastic process be defined as $X_t=g(t+T)$, where $T\sim U(0,k)$.
I'm trying to prove that $\{X_t:t \geq0\}$ is weak-sense stationary. I ...
0 votes · 0 answers · 25 views
Let $\{X_t\}$ be a stochastic process such that $\mathbb{E}(X_t)=3$ and $R_X(t,s)=5+e^{\frac{1}{2}|t-s|}$.
Find the mean, variance and covariance of $X_2$ and $X_5$.
I know that for all $t$, $\mathbb{...
4 votes · 2 answers · 122 views
What is the relation between rate and probability? [closed]
Consider the specific example where a row of light bulbs (initially off), arbitrarily turn on at different rates. For a light bulb at position $x$, call this rate $f(x)$. This is an irreversible ...
0 votes · 0 answers · 41 views
Consider the stochastic process $\{X_t\}$ given by $X_t=\cos(2\pi ft+\alpha\beta)$
Here $f,t \in \mathbb{R}$, $\alpha \sim Bernoulli(p)$ and $\beta \sim U\left(-\pi/2, \pi/2 \right)$. Find the ...
0 votes · 2 answers · 62 views
Let $\{X_t:t \geq 0\}$ be a stochastic process with independent increments and mean function $m_{X}(t)=E(X_t) < \infty$.
Let $0<t_1<\dots<t_n<t_{n+1}$. Show that
$$
E\left[X_{t_{n+1}}|...
0 votes · 0 answers · 4k views
The expected number of throws of an $n$-faced die until the sum is a multiple of $m$
Consider a fair die with $n$ sides, numbered from $1$ to $n$. Imagine this die is rolled over and over until the cumulative sum of the rolls is divisible by $m$. The question we seek to answer is: ...
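A quick simulation of the setup. For context, the answer is exactly $m$: the running sum mod $m$ is a Markov chain on $\{0,\dots,m-1\}$ whose transition matrix is doubly stochastic, so its stationary distribution is uniform and the mean return time to residue $0$ is $1/(1/m) = m$.

```python
import numpy as np

def mean_throws_until_multiple(n, m, trials=20_000, seed=3):
    # Roll a fair n-sided die until the running sum is divisible by m;
    # return the average number of rolls over many independent trials.
    rng = np.random.default_rng(seed)
    total = 0
    for _ in range(trials):
        s, k = 0, 0
        while True:
            s = (s + rng.integers(1, n + 1)) % m  # running sum mod m
            k += 1
            if s == 0:
                break
        total += k
    return total / trials
```

Note the answer does not depend on $n$, which the simulation makes easy to check across several die sizes.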