
All Questions

15 votes
1 answer
204 views

Show that $\mathbb{E}\left|\hat{f_n}-f \right| \leq \frac{2}{n^{1/3}}$ where $\hat{f_n}$ is a density estimator for $f$

Question: Suppose we have a continuous probability density $f : \mathbb{R} \to [0,\infty)$ such that $\sup_{x \in \mathbb{R}}\left(\left|f(x)\right| + \left|f'(x)\right|\right) \leq 1$. Define the ...
asked by yasinibrahim30
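
A sketch under one common reading of this exercise (the excerpt is truncated before the estimator is defined): assume the boxcar kernel estimator $\hat f_n(x) = \frac{1}{2h_n n}\sum_{i=1}^n \mathbf{1}\{|X_i - x|\le h_n\}$ with bandwidth $h_n = n^{-1/3}$, and check the pointwise error against $2/n^{1/3}$ by Monte Carlo.

```python
# Minimal sketch, NOT necessarily the exercise's estimator (the post is truncated):
# boxcar kernel f_hat(x) = (1/(2*h*n)) * #{i : |X_i - x| <= h} with h = n**(-1/3),
# checked at a single point x0 for N(0,1) data (which satisfies sup(|f| + |f'|) <= 1).
import numpy as np

rng = np.random.default_rng(0)

def f_hat(x, sample, h):
    """Boxcar (moving-histogram) density estimate at the point x."""
    return np.mean(np.abs(sample - x) <= h) / (2 * h)

n, reps, x0 = 1000, 2000, 0.0
h = n ** (-1 / 3)
true_f = 1 / np.sqrt(2 * np.pi)        # standard normal density at x0 = 0

errs = [abs(f_hat(x0, rng.standard_normal(n), h) - true_f) for _ in range(reps)]
print(f"Monte Carlo E|f_hat - f| at x0 : {np.mean(errs):.4f}")
print(f"Claimed bound 2 / n^(1/3)      : {2 / n ** (1 / 3):.4f}")
```
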
4 votes
1 answer
372 views

Exponential bound for tail of standard normal distributed random variable

Let $X\sim N(0,1)$ and $a\geq 0$. I have to show that $$\mathbb{P}(X\geq a)\leq\frac{\exp(\frac{-a^2}{2})}{1+a}$$ I have no problem showing that $\mathbb{P}(X\geq a)\leq \frac{\exp(\frac{-a^2}{2})}{a\...
asked by stats19
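
A quick numerical check of the claimed inequality (illustration only, not a proof), using the exact normal survival function:

```python
# Check P(X >= a) <= exp(-a^2/2) / (1 + a) for X ~ N(0,1) on a grid of a >= 0.
import numpy as np
from scipy.stats import norm

a = np.linspace(0, 6, 61)
tail = norm.sf(a)                       # exact P(X >= a)
bound = np.exp(-a**2 / 2) / (1 + a)     # claimed upper bound

assert np.all(tail <= bound + 1e-12), "bound violated somewhere on the grid"
for ai, t, b in zip(a[::10], tail[::10], bound[::10]):
    print(f"a = {ai:.1f}   P(X >= a) = {t:.3e}   bound = {b:.3e}")
```
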
3 votes
1 answer
885 views

Lower bounds on sum of squared sub-gaussians

Letting $\left\{X_{i}\right\}_{i=1}^{n}$ be an i.i.d. sequence of zero-mean sub-Gaussian variables with parameter $\sigma,$ define $Z_{n} :=\frac{1}{n} \sum_{i=1}^{n} X_{i}^{2} .$ Prove that $$ \...
asked by david
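
The displayed bound is cut off, so only the qualitative behaviour is illustrated below: for Gaussian $X_i$ (one sub-Gaussian case, with $\mathbb{E}[X_i^2] = \sigma^2$), the lower tail of $Z_n$ shrinks rapidly as $n$ grows.

```python
# Monte Carlo illustration of the lower tail of Z_n = (1/n) * sum X_i^2 for
# Gaussian X_i; the exact bound asked about is truncated in the excerpt.
import numpy as np

rng = np.random.default_rng(1)
sigma, delta, reps = 1.0, 0.3, 10_000

for n in (10, 50, 200, 1000):
    X = rng.normal(0.0, sigma, size=(reps, n))
    Z = np.mean(X**2, axis=1)
    p = np.mean(Z <= (1 - delta) * sigma**2)     # empirical lower-tail probability
    print(f"n = {n:5d}   P(Z_n <= (1 - delta) * sigma^2) ≈ {p:.4f}")
```
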
3 votes
1 answer
94 views

Minimum number of Bernoulli trials until sum reaches threshold with high probability

Let $X_1, X_2, \dots$ be i.i.d. $Bern(p)$ with $p\in (0, 1)$. Let $\delta \in (0,1)$ and $m \in \mathbb{N}$. What is the smallest integer $n \in \mathbb{N}$ such that $$P\left( \sum_{i=1}^n X_i \geq m ...
asked by MATHX
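
The question presumably wants a closed-form or Chernoff-style answer; numerically, though, the exact smallest $n$ can be found by scanning the Binomial survival function (the helper `smallest_n` below is just for illustration):

```python
# The sum of n i.i.d. Bern(p) variables is Binomial(n, p), so scan n upward
# until P(Binomial(n, p) >= m) >= 1 - delta.
from scipy.stats import binom

def smallest_n(m: int, p: float, delta: float) -> int:
    """Smallest n with P(Binomial(n, p) >= m) >= 1 - delta."""
    n = m                                   # at least m trials are needed
    while binom.sf(m - 1, n, p) < 1 - delta:
        n += 1
    return n

print(smallest_n(m=10, p=0.3, delta=0.05))
```
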
3 votes
1 answer
2k views

Cramér-Rao (Casella & Berger 7.38) for an exponential family

The question states: "Let $X_{1}, \dots, X_{n}$ be a random sample from $f(x \mid \theta) = \theta\, x^{\theta-1}$ for $0 < x < 1$, $\theta > 0$. Is there a function of $\theta$, $g(\theta)$, ...
asked by sophie-germain
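
On the usual reading of this exercise, the answer is $g(\theta) = 1/\theta$: the statistic $T = -\frac{1}{n}\sum_i \log X_i$ is unbiased for it and its variance attains the Cramér-Rao lower bound $1/(n\theta^2)$. A Monte Carlo sketch of that claim (parameter values are arbitrary):

```python
# Check that T = -(1/n) * sum(log X_i) is unbiased for 1/theta and that
# Var(T) matches the Cramer-Rao lower bound 1/(n * theta^2).
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 2.5, 50, 50_000

# Sample from f(x|theta) = theta * x^(theta - 1) on (0, 1) via X = U^(1/theta).
X = rng.uniform(size=(reps, n)) ** (1 / theta)

T = -np.mean(np.log(X), axis=1)
print("mean of T     :", T.mean(), "  target 1/theta     :", 1 / theta)
print("variance of T :", T.var(),  "  CRLB 1/(n theta^2) :", 1 / (n * theta**2))
```
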
3 votes
1 answer
81 views

The expected weight-ratio between weighted and un-weighted balls when picked from a bin without replacement

The Problem: The problem, I believe, can be stated in the following way: Given $K$ white balls, all without weight (one can say that the weight is $0$), and $N - K$ red balls with individual ...
asked by Johannes Ringmark
2 votes
2 answers
118 views

Lower bound on the $\Phi$-entropy of a Gaussian variable

I am trying to prove that for $X$ a centered Gaussian variable, $$\limsup_{n\in\mathbb{N}}\,\mathbb{E}\left[(X+n)^2\log\left(\frac{(X+n)^2}{1+n^2}\right)\right]=2.$$ I already know by the Gaussian ...
asked by John Do
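
A numerical check of the claimed limit (not a proof), assuming $X \sim N(0,1)$ so that $\mathbb{E}[(X+n)^2] = 1 + n^2$, via Gauss-Hermite quadrature:

```python
# Evaluate E[(X+n)^2 * log((X+n)^2 / (1+n^2))] for X ~ N(0,1) and growing n;
# the values should approach 2 if the claim is right.
import numpy as np

nodes, weights = np.polynomial.hermite.hermgauss(100)
x = np.sqrt(2) * nodes                   # change of variables for E[h(X)], X ~ N(0,1)
w = weights / np.sqrt(np.pi)

for n in (1, 10, 100, 1000):
    h = (x + n) ** 2 * np.log((x + n) ** 2 / (1 + n ** 2))
    print(f"n = {n:5d}   E[(X+n)^2 log((X+n)^2/(1+n^2))] ≈ {np.sum(w * h):.6f}")
```
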
2 votes
1 answer
2k views

Log det of covariance and entropy

I understand that the log-determinant of the covariance matrix bounds the entropy for Gaussian-distributed data. Is this the case for non-Gaussian data as well, and if so, why? What does the determinant of the covariance ...
asked by hearse
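
The fact behind the question: among all densities with a given covariance $\Sigma$, the Gaussian maximizes differential entropy, so $h(X) \le \tfrac{1}{2}\log\det(2\pi e\,\Sigma)$ holds for non-Gaussian data as well. A one-dimensional sketch with matched variances:

```python
# Compare the Gaussian entropy bound 0.5 * log(2*pi*e*var) with the entropy of a
# uniform distribution having the same variance; the uniform falls strictly below.
import numpy as np

var = 2.0
gauss_bound = 0.5 * np.log(2 * np.pi * np.e * var)   # entropy of N(0, var), and the general bound

a = np.sqrt(3 * var)                                 # Uniform(-a, a) has variance a^2 / 3
uniform_entropy = np.log(2 * a)

print(f"Gaussian entropy (= bound)     : {gauss_bound:.4f}")
print(f"Uniform entropy, same variance : {uniform_entropy:.4f}")
```
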
2 votes
0 answers
33 views

Lower bound of $\frac{\|(\mathbf X \otimes \mathbf X^\top)\theta\|_2^2}{np}$

According to Theorem 7.16 of High-Dimensional Statistics: A Non-Asymptotic Viewpoint (M. Wainwright, 2019), we know that for $\mathbf X\in\mathbb R^{n\times p}, X_{ij}\overset{iid}{\sim}N(0,1),$ there ...
asked by Jasper Cha
2 votes
0 answers
92 views

Probabilistic bound on difference of Lipschitz random function

I am currently facing the following problem: Let $(X_1,Z_1),\ldots,(X_n,Z_n)$ be $n$ i.i.d. sample points from some distribution $p$ supported on $\mathcal X\times\{-1,1\}$ where $\mathcal X\subseteq ...
asked by Stratos supports the strike
2 votes
0 answers
212 views

Hoeffding's Inequality Assumptions

I'm looking for the assumptions of Hoeffding's inequality to check whether it is applicable to my problem. So far the only assumptions I can find are that the variables $Z_i$ are i.i.d. and bounded. However, I'm ...
asked by curiouscat22
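
For reference, the standard statement only needs independence and boundedness $a_i \le Z_i \le b_i$; identical distribution is not required. A simulation sketch (the particular distributions below are arbitrary illustrations):

```python
# Hoeffding: P(|S_n - E[S_n]| >= t) <= 2 * exp(-2 t^2 / sum_i (b_i - a_i)^2)
# for independent Z_i with a_i <= Z_i <= b_i; here Z_i ~ Uniform(a_i, b_i)
# with heterogeneous widths.
import numpy as np

rng = np.random.default_rng(3)
n, reps, t = 100, 100_000, 8.0

a = np.zeros(n)
b = rng.uniform(0.5, 1.5, size=n)
Z = rng.uniform(a, b, size=(reps, n))

dev = np.abs(Z.sum(axis=1) - (a + b).sum() / 2)       # |S_n - E[S_n]|
empirical = np.mean(dev >= t)
hoeffding = 2 * np.exp(-2 * t**2 / np.sum((b - a)**2))

print(f"empirical P(|S - ES| >= t) = {empirical:.5f}")
print(f"Hoeffding bound            = {hoeffding:.5f}")
```
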
2 votes
0 answers
70 views

Show that $\operatorname{Pr}(Z-X \geq 0)$ converges to one

Suppose that $V_i$, for $i \in \mathbb{N}$, are i.i.d. standard normal random variables and $Y_i = \sum_{k=1}^i V_k$ for $i \in \mathbb{N}$ with $Y_0 = 0$. Let $X_n = (\sum_{i=1}^n V_i Y_{i-1})^2 Y_n^...
asked by KRL
1 vote
3 answers
906 views

Is there a way to bound expected value with limited information of the CDF?

Suppose I want to evaluate $E[X]$, where $X$ is a univariate random variable and takes values in $\mathcal{X}$, where the smallest element of $\mathcal{X}$ is 0 and the largest element of $\mathcal{X}$...
asked by user52932
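
One standard device, assuming $X \ge 0$ and bounded above by $x_{\max}$: $E[X] = \int_0^{x_{\max}} (1 - F(x))\,dx$, so pointwise bounds on the CDF translate into bounds on the mean. A sketch with hypothetical CDF envelopes:

```python
# Bound E[X] from envelopes F_lower <= F <= F_upper using E[X] = integral of (1 - F).
# The envelopes here are hypothetical, just to show the mechanics.
import numpy as np

x_max = 10.0
x = np.linspace(0.0, x_max, 1001)
dx = x[1] - x[0]

F_lower = np.clip(x / x_max - 0.05, 0.0, 1.0)   # hypothetical lower envelope of the CDF
F_upper = np.clip(x / x_max + 0.05, 0.0, 1.0)   # hypothetical upper envelope of the CDF

EX_upper = np.sum(1.0 - F_lower) * dx           # smaller CDF -> larger survival -> larger mean
EX_lower = np.sum(1.0 - F_upper) * dx           # simple Riemann sums suffice for a sketch

print(f"E[X] lies between {EX_lower:.3f} and {EX_upper:.3f}")
```
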
1 vote
2 answers
33 views

How to calculate the width of a variance

Short version: I have a series of results that sit within clear upper and lower bounds relative to the starting value. I do not know how to find those bounds (and thus the width of the band). I would ...
asked by Matthew Brown aka Lord Matt
1 vote
1 answer
47 views

Variance of sum of deviations

Suppose I have an i.i.d. sample $\{X_i\}_{i=1}^M$ for some positive integer $M$, and suppose that $X_i \sim X$ for some random variable $X$ with finite variance. Then, denote by $$ E_M = \frac1M\sum_{...
asked by G. Gare
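
The excerpt cuts off before the quantity of interest, so both common readings are sketched below: deviations from the sample mean $E_M$ sum to exactly zero, while deviations from the population mean $E[X]$ sum to a quantity with variance $M\,\mathrm{Var}(X)$.

```python
# Two readings of "sum of deviations": around the sample mean E_M (identically zero)
# and around the population mean (variance M * Var(X)).
import numpy as np

rng = np.random.default_rng(4)
M, reps = 20, 100_000
mu, var = 1.0, 4.0

X = rng.normal(mu, np.sqrt(var), size=(reps, M))
E_M = X.mean(axis=1, keepdims=True)

sum_dev_sample = (X - E_M).sum(axis=1)   # zero up to floating-point error
sum_dev_pop = (X - mu).sum(axis=1)

print("Var of sum of deviations from sample mean :", sum_dev_sample.var())
print("Var of sum of deviations from E[X]        :", sum_dev_pop.var(), "  M*Var(X):", M * var)
```
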
