
Questions tagged [probability-theory]

For questions solely about the modern theoretical footing for probability, for example, probability spaces, random variables, law of large numbers, and central limit theorems. Use [tag:probability] instead for specific problems and explicit computations. Use [tag:probability-distributions] for specific distribution functions, and consider [tag:stochastic-processes] when appropriate.

0 votes
0 answers
13 views

LLNs that can be applied when the dimension of the vector also goes to infinity?

Suppose $\hat{\mathbf{X}}$ is an empirical average of $\mathbf{X}=\left[\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_n\right] \in \mathbb{R}^{p \times n}$. I had written the following only to be ...
the_firehawk • 2,425
0 votes
0 answers
47 views

If $X_n$ is a martingale and $N$ is a stopping time, is $X_{n+N}$ a martingale?

Is this true? If it is, can we replace martingale with sub- or supermartingale? My attempt (on submartingales): $\mathbb{E}[X_{n+N+1}\vert X_{n+N}]=\mathbb{E}[\mathbb{E}[X_{n+N+1}\vert N,X_{n+N}]\vert X_{n+N}]\ge \...
Ho-Oh • 919
1 vote
1 answer
31 views

Questions about proving $\mathbb{P}\left(T_a<\infty\right)=1$ with $T_a:=\inf \{t>0: B_t \ge a\}$

Let $\left(B_t, t \geq 0\right)$ be a one-dimensional Brownian motion starting from the origin (i.e., $B_0=0$). Let $\mathcal{F}_t:=\sigma\left(B_s: s \leq t\right)$ be the filtration ...
Ho-Oh • 919
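For context, one standard route to this fact (the exercise above may intend a different one) goes through the reflection principle. Since $B$ has continuous paths, $\{T_a \le t\} = \{\sup_{s\le t} B_s \ge a\}$, and for every $t>0$,

$$\mathbb{P}\Big(\sup_{s \le t} B_s \ge a\Big) = 2\,\mathbb{P}(B_t \ge a) = 2\,\mathbb{P}\big(Z \ge a/\sqrt{t}\big), \qquad Z \sim N(0,1),$$

which tends to $2 \cdot \tfrac{1}{2} = 1$ as $t \to \infty$; letting $t\to\infty$ gives $\mathbb{P}(T_a<\infty)=1$.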
1 vote
1 answer
48 views

The Riesz representation theorem and probability density functions?

The Riesz representation theorem asserts that if the linear functional $L:C[a,b]\rightarrow\mathbb{R}$ is bounded (and hence continuous), then there exists an $\alpha\in BV[a,b]$ with $\operatorname{...
user775349
0 votes
0 answers
17 views

Asymptotic behaviour of the difference process of the Polya urn

Assume we have a Polya urn with $(a,b)$ initial (time $t=0$) red and green balls, respectively. At each point in time we draw a ball at random and return it together with $S$ further balls. This means ...
Irene • 1
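For readers wanting to experiment, here is a minimal simulation sketch of the urn described above, assuming (as in the standard Polya urn) that the $S$ added balls share the drawn ball's colour; the parameter values are illustrative, not from the question.

```python
import random

def polya_urn(a, b, S, steps, rng):
    """Simulate a Polya urn starting with a red and b green balls.

    At each step a ball is drawn uniformly at random and returned
    together with S additional balls of the same colour.
    """
    red, green = a, b
    for _ in range(steps):
        if rng.random() < red / (red + green):
            red += S
        else:
            green += S
    return red, green

rng = random.Random(0)
red, green = polya_urn(a=2, b=3, S=1, steps=1000, rng=rng)
# The total count is deterministic: a + b + S * steps.
print(red + green, red / (red + green))
```

The fraction of red balls is a bounded martingale, so repeated runs illustrate its a.s. convergence to a (run-dependent) limit, which is the phenomenon behind the difference process in the question.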
1 vote
1 answer
67 views

Probability of an event involving the maximum of (stochastically ordered) random variables

Let $X_1,\dots,X_n\geq 0$ be independent, continuous random variables, and assume that $X_i$ stochastically dominates $X_j$ if $i<j$. This means that for any $x>0$, \begin{align} \mathbb{P}(X_i\...
svonimir • 381
1 vote
1 answer
52 views

Why is it harder to prove lower bounds on expected maxima (than upper bounds)?

I am currently learning upper and lower bounds for the expected maxima of random variables, e.g. bounds for the quantity: $$ \mathbb{E}\left[\max_{1 \leq i \leq n} X_i\right] $$ Most of the ...
rubikscube09 • 3,855
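A quick numerical illustration of the asymmetry the question asks about: for $n$ i.i.d. standard Gaussians, the classical upper bound $\mathbb{E}[\max_i X_i] \le \sqrt{2\log n}$ is easy to state, while matching lower bounds require more work. The Monte Carlo estimate below (sample sizes are illustrative) sits visibly below the upper bound.

```python
import math
import random

def mc_expected_max(n, trials, rng):
    """Monte Carlo estimate of E[max of n i.i.d. standard normals]."""
    total = 0.0
    for _ in range(trials):
        total += max(rng.gauss(0.0, 1.0) for _ in range(n))
    return total / trials

rng = random.Random(42)
n = 100
est = mc_expected_max(n, trials=2000, rng=rng)
upper = math.sqrt(2.0 * math.log(n))  # classical sub-Gaussian upper bound
print(f"estimate ~ {est:.3f}, upper bound sqrt(2 log n) = {upper:.3f}")
```

For $n=100$ the true value is about $2.5$, noticeably under $\sqrt{2\log 100}\approx 3.04$, which hints at why the upper bound alone does not pin down the order of the maximum.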
1 vote
0 answers
34 views

Deterministic part of Lévy-Khinchin formula for measures on $[0,\infty)$

I’m working on an exercise involving the deterministic part of the Lévy-Khinchin formula for an infinitely divisible probability distribution on $[0,\infty)$. A probability measure $\mu$ on $\mathbb R$...
D Ford • 4,065
1 vote
0 answers
46 views

Why is the layer cake representation true?

As in the title, I am having a lot of trouble convincing myself of the layer cake representation. For clarity, let's use the following article as reference: https://en.wikipedia.org/...
some1fromhell
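For reference, the representation in question is a one-line consequence of Tonelli's theorem: for measurable $f \ge 0$ on a measure space $(X,\mu)$, write $f$ as an integral of indicator "layers" and swap the order of integration,

$$f(x) = \int_0^\infty \mathbf{1}\{f(x) > t\}\, dt \quad\Longrightarrow\quad \int_X f \, d\mu = \int_0^\infty \mu\big(\{x : f(x) > t\}\big)\, dt.$$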
-1 votes
1 answer
30 views

Are these sufficient conditions for a joint normal distribution?

Suppose $\theta$, $u_1$, $u_2$ all follow normal distributions. Let $x_1=\theta+u_1$ and $x_2=\theta+u_2$. Are the above conditions sufficient to imply that $\theta$, $x_1$, and $x_2$ follow a ...
Ypbor • 818
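Marginal normality alone is not enough, but if one additionally assumes $\theta$, $u_1$, $u_2$ are *independent* (an assumption not stated in the excerpt), then $(\theta, x_1, x_2)$ is a linear image of an independent Gaussian vector and hence jointly normal. The sketch below writes out that linear map; the variances are illustrative.

```python
import numpy as np

# Linear map (theta, u1, u2) -> (theta, x1, x2), assuming independence:
A = np.array([[1.0, 0.0, 0.0],   # theta
              [1.0, 1.0, 0.0],   # x1 = theta + u1
              [1.0, 0.0, 1.0]])  # x2 = theta + u2

sigma = np.diag([2.0, 1.0, 3.0])  # Cov(theta, u1, u2), illustrative values
joint_cov = A @ sigma @ A.T       # covariance of (theta, x1, x2)
print(joint_cov)
```

Note that $\operatorname{Cov}(x_1, x_2) = \operatorname{Var}(\theta)$ here, and the resulting matrix is symmetric positive definite, as a Gaussian covariance must be.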
1 vote
0 answers
13 views

Markov process transition semigroup

I found two definitions of a Markov process and I am trying to understand how they are connected. Let $X=\left(X_t\right)_{t\geq 0}$ be an $\mathcal{F}_t$-adapted process. We say $X$ is a Markov ...
kays44 • 41
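The usual bridge between the two common definitions (assuming time homogeneity) is the operator family

$$P_t f(x) := \mathbb{E}\big[f(X_t)\,\big|\, X_0 = x\big], \qquad P_{t+s} = P_t \, P_s,$$

where the semigroup identity $P_{t+s} = P_t P_s$ is exactly the Markov property $\mathbb{E}[f(X_{t+s})\mid \mathcal{F}_t] = (P_s f)(X_t)$ integrated over the law of $X_t$.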
1 vote
0 answers
41 views

Optimal matching of Bernoulli random variables

Let $Z_1, \ldots, Z_n$ be a sequence of independent Bernoulli random variables such that for all $i\in\left\{1,\ldots,n\right\}$, $Z_i\sim\mathcal{B}(p_i)$ where $p_i < 1/2$. Define $l(x_{1:n}, y_{1:n}) =...
Ibra • 175
-1 votes
1 answer
34 views

Contradiction in probability change of variables? [closed]

While doing some calculations I ran into a simple yet seemingly bizarre contradiction. Assume we have PDFs $p_x,p_y$ s.t. $x\sim N(0,1), y\sim N(1,1)$. Thus, for $f(x)=x-1$ we have that $p_x(f(x))=p_y(...
Yonatan Gideoni
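Apparent contradictions of this kind usually dissolve once the one-dimensional change-of-variables formula $p_y(y) = p_x\big(g^{-1}(y)\big)\,\big|\tfrac{d}{dy} g^{-1}(y)\big|$ is applied carefully; for the pure shift $y = x + 1$ in the question's setup the Jacobian factor is $1$. A numeric sanity check (the evaluation point is arbitrary):

```python
import math

def normal_pdf(z, mu=0.0, sigma=1.0):
    """Density of N(mu, sigma^2) at z."""
    return math.exp(-((z - mu) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

# x ~ N(0, 1), y = g(x) = x + 1 ~ N(1, 1).  Change of variables:
#   p_y(y) = p_x(g^{-1}(y)) * |d g^{-1}/dy| = p_x(y - 1) * 1
y = 0.3
lhs = normal_pdf(y, mu=1.0)   # p_y(y) evaluated directly
rhs = normal_pdf(y - 1.0)     # p_x(y - 1); Jacobian factor is 1 here
print(lhs, rhs)
```

The two evaluations agree to machine precision, so any discrepancy in a calculation like the question's must come from comparing densities at mismatched points or dropping a non-unit Jacobian.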
0 votes
1 answer
54 views

Counter example to the identity theorem for two generating functions

I want to give an example of two generating functions $\psi_{X_+}$ and $\psi_{X_-}$ for random variables $X_+$ and $X_-$ with values in $\mathbb{N}_0$ which coincide on infinitely many points $x_i\in(...
Christoph Mark
0 votes
0 answers
37 views
+50 bounty

Can we relax the assumptions and still get a.s. convergence?

Assume you have a sequence of independent Bernoulli random variables $X_i$, each with probability $p_i$. Let $c_i$ be a sequence of real numbers and $m$ be a real number such that $c_1>0$ and with ...
user394334 • 1,222
