
Questions tagged [probability-theory]

For questions solely about the modern theoretical footing for probability, for example, probability spaces, random variables, law of large numbers, and central limit theorems. Use [tag:probability] instead for specific problems and explicit computations. Use [tag:probability-distributions] for specific distribution functions, and consider [tag:stochastic-processes] when appropriate.

0 votes
0 answers
19 views

Does $\text{Var}((AWy)^TAWy) \geq \text{Var}((AW_1y)^TAW_1y)$?

Suppose $W$ is an $n\times n$ random matrix with each entry i.i.d. $\sim N(0,1)$, let $A = (WDW^T +\lambda I_n)^{-1}$, where $D$ is a diagonal matrix with every entry $>0$ and $I_n$ is the identity matrix ...
asked by FreshSSS
0 votes
0 answers
41 views

Verifying that a sum of compound-Poisson distributed random variables is convergent

Given a finite measure $\nu$ on $\mathbb R$, the compound Poisson distribution with intensity $\nu$ is: $$ \mathrm{CPoi}_\nu = e^{-\nu(\mathbb R)} \sum_{n=0}^\infty \frac{\nu^{*n}}{n!} $$ where $\nu^{*...
asked by D Ford (4,065)
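The series in the excerpt is exactly the law of a Poisson-many sum of i.i.d. jumps, which gives a direct sampler. A minimal sketch of that reading; the concrete choice $\nu = 2\cdot N(1,1)$ is an illustrative assumption, not taken from the post:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_compound_poisson(total_mass, jump_sampler, size):
    """Sample from CPoi_nu with nu = total_mass * (law of jump_sampler):
    draw N ~ Poisson(nu(R)), then add N i.i.d. jumps, matching the series
    e^{-nu(R)} * sum_n nu^{*n} / n! term by term (N = 0 contributes delta_0)."""
    counts = rng.poisson(total_mass, size=size)
    return np.array([jump_sampler(n).sum() for n in counts])

# Illustration: nu = 2 * N(1, 1), so E[CPoi_nu] = nu(R) * E[jump] = 2.
samples = sample_compound_poisson(2.0, lambda n: rng.normal(1.0, 1.0, size=n), 50_000)
print(samples.mean())  # close to 2
```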
0 votes
1 answer
48 views

If $X$ is a uniformly distributed discrete random variable, under what condition is $Y=\phi(X)$ uniform too?

More specifically, I was solving the following problem: Let $X$ be a discrete random variable that is uniformly distributed over the set $S=\{-10, -9, \dots, 0, \dots, 9, 10\}$. Which of the following random ...
asked by Awe Kumar Jha
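For $X$ uniform on a finite set, $Y=\phi(X)$ is uniform exactly when every value in the image of $\phi$ has the same number of preimages (equal-size fibers). A quick check of that criterion on the set $S$ from the excerpt; the example maps are illustrative assumptions, not the post's answer choices:

```python
from collections import Counter

S = list(range(-10, 11))  # the uniform support from the question

def image_is_uniform(phi):
    """Y = phi(X) is uniform iff every value in the image has the
    same number of preimages in S (equal-size fibers)."""
    fibers = Counter(phi(x) for x in S)
    return len(set(fibers.values())) == 1

print(image_is_uniform(lambda x: x + 1))   # True: injective, all fibers size 1
print(image_is_uniform(lambda x: abs(x)))  # False: +/-k collide but 0 is alone
print(image_is_uniform(lambda x: x * x))   # False for the same reason
```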
1 vote
0 answers
27 views

Centered subgaussian variables have better properties

I am trying to understand the following proof: Main Confusion: In particular, I am having a very hard time understanding the chain of inequalities in the proof for (3)': I think the first equality is ...
asked by Partial T (583)
0 votes
1 answer
17 views

Understanding the proof for Properties of Subgaussian Variables

Here are the definitions, statements, and the proof that I am stuck on: I am stuck on the last part of the proof, where the author claims that setting $C = e$ automatically guarantees that (1) holds ...
asked by Partial T (583)
0 votes
0 answers
24 views

Conditioning a probability distribution on a certain event

Suppose I have a nice distribution given by its pdf $p(x)$ on $\mathbb{R}^n$. It is usually problematic to condition on sets with zero measure (e.g. the Borel–Kolmogorov paradox). Nonetheless, given a ...
asked by Daniel Robert-Nicoud
4 votes
1 answer
38 views

Counterexample for: $E(X \mid Y, Z) = f(Y)$ implies that $X$ is independent of $Z$ given $Y$?

I am looking for a nice counterexample to the statement that $\mathbb{E}(X \mid Y, Z) = f(Y)$ for a measurable function $f$ implies that $X$ is independent of $Z$ given $Y$, i.e. $X \perp Z \mid Y$. Any help ...
asked by Grandes Jorasses
1 vote
1 answer
50 views

Variance of normalized random vector

Let $X=(X_1, \ldots, X_n)$ be a random vector whose entries are all independent and identically distributed according to some distribution $f$ with finite moments. Let $\bar{X} = \frac{1}{n}\sum_{i=1}^...
asked by fennel (35)
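For the sample mean $\bar X$ defined in the excerpt, i.i.d. entries with finite moments give $\operatorname{Var}(\bar X)=\operatorname{Var}(X_1)/n$. A quick numerical check of that identity; the choice $f=\mathrm{Exp}(1)$ is an assumption for illustration, since the post's $f$ is unspecified:

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 10, 100_000

# X = (X_1, ..., X_n) with i.i.d. entries from some f with finite moments
X = rng.exponential(1.0, size=(reps, n))
Xbar = X.mean(axis=1)  # the sample mean from the excerpt

print(Xbar.var())  # close to Var(X_1)/n = 1/10 for Exp(1) entries
```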
0 votes
0 answers
20 views

Critical case of Galton-Watson Process

I'm reading the proof in Athreya, Branching Processes (but I think this is a classical result) of the exponential limit law in the critical case of the Galton–Watson process, i.e.: $\mathcal{L}(Z_n \vert ...
asked by user1343035
0 votes
0 answers
25 views

Show that $Y_n$ converges in distribution to a law [duplicate]

Let $\{U_i\}_{i\geq 1}$ be a family of independent random variables with the same distribution, $U[0,1]$ (uniform on $[0,1]$). Define $Y_n=\max_{1\leq i \leq n}\frac{U_i}{i}$ and show that $Y_n$ ...
asked by Nicolas Rodriguez
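By independence, $P(Y_n\le y)=\prod_{i=1}^n P(U_i\le iy)=\prod_{i:\,iy<1} iy$, which suggests the candidate limit law. A hedged simulation comparing the empirical CDF with that product; all parameters are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 500, 10_000
i = np.arange(1, n + 1)

# Y_n = max_{1<=i<=n} U_i / i  with U_i i.i.d. Uniform[0,1]
U = rng.random((reps, n))
Y = (U / i).max(axis=1)

def limit_cdf(y):
    """Candidate limit: F(y) = prod over i with i*y < 1 of (i*y);
    the factors with i*y >= 1 equal P(U_i <= i*y) = 1 and drop out."""
    terms = y * np.arange(1, int(np.floor(1.0 / y)) + 1)
    return terms[terms < 1].prod()

for y in (0.3, 0.5, 0.8):
    print(round((Y <= y).mean(), 3), round(limit_cdf(y), 3))  # should roughly agree
```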
5 votes
0 answers
107 views
+50 bounty

Probability of two confined randomly walking bodies overlapping

EDIT: I have tried to rephrase the problem, title, and context to match my solution. I am wondering about expanding a problem I have to the continuous domain. The problem is defined as follows: Problem: Given $N$...
asked by gokudegrees
0 votes
0 answers
49 views

Show that $S(\omega) := \sum _{i=1}^{Y(\omega)} X_i(\omega)$ is a square-integrable random variable in $L^2(\Omega,\mathcal A, P)$

Let $n \in \Bbb N$, $M \geq 0$ be constants, $(\Omega,\mathcal A,P)$ be a probability space, $Y : \Omega \to \{0,1,\dots ,n\}$ a random variable, and $X_1,\dots, X_n : \Omega \to [-M,M]$ random variables with: (i) ...
asked by gagamaga
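Reading the cut-off hypotheses as bounding each $X_i$ by $M$ (and $Y$ by $n$), square-integrability follows from a one-line bound; a sketch under that reading:

```latex
|S(\omega)| = \Bigl|\sum_{i=1}^{Y(\omega)} X_i(\omega)\Bigr|
  \le Y(\omega)\,M \le nM
  \quad\Longrightarrow\quad
  E[S^2] \le n^2 M^2 < \infty ,
```

so $S$ is bounded almost surely and hence lies in $L^2(\Omega,\mathcal A,P)$.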
1 vote
1 answer
36 views

Change of integral measure involving a supremum and minimization with a function on a standard probability space

Let $(X,\mathcal B, \mu)$ be a standard probability space and $0<\beta<1$. Let $A\in \mathcal B$ such that $0<\mu(A)<\infty$. Let $\varphi:A\to \mathbb R$ be a map. Define $\omega:=\sup\...
asked by abcdmath (2,007)
0 votes
1 answer
36 views

Is the set of probability measures on $[0,1]$ that induce a continuous distribution function weak-* compact?

I am a student in economics, and I am trying to solve a fixed-point problem where the inputs of the function are probability measures. I'm trying to figure out whether the results below are true, but ...
asked by djsteve
0 votes
0 answers
121 views

On a conditional expectation property; substitution rule

For days now, I've been trying to prove the identity $$E(f(X,Y)\mid Y=y)=E(f(X,y)\mid Y=y).$$ I have found a couple of posts about this identity, mainly this one, and the more I think about this and ...
asked by psie (801)
