
All Questions

2 votes
0 answers
33 views

Lower bound of $\frac{\|(\mathbf X \otimes \mathbf X^\top)\theta\|_2^2}{np}$

According to Theorem 7.16 of High-Dimensional Statistics: A Non-Asymptotic Viewpoint (M. Wainwright, 2019), we know that for $\mathbf X\in\mathbb R^{n\times p}, X_{ij}\overset{iid}{\sim}N(0,1),$ there ...
Jasper Cha
0 votes
1 answer
58 views

Bounds on the ratio between the second raw moment and the squared expected absolute value

I'm interested in bounds for the ratio $E[X^2]/E[|X|]^2$. The best lower bound is $1$, since $E[X^2] = E[|X|^2] \geq E[|X|]^2$, i.e. $\mathrm{Var}(|X|) \geq 0$. On the other hand, I would like to know if there ...
Bridi • 63
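The excerpt is cut off, but the lower bound of $1$ is easy to probe numerically. A quick Monte Carlo sketch (not from the thread) estimating the ratio for two distributions — for $X\sim N(0,1)$ the ratio is $\pi/2$, for Uniform$(-1,1)$ it is $4/3$:

```python
import math
import random

random.seed(0)
N = 200_000

def ratio(sample):
    """Monte Carlo estimate of E[X^2] / E[|X|]^2."""
    xs = [sample() for _ in range(N)]
    ex2 = sum(x * x for x in xs) / N
    eabs = sum(abs(x) for x in xs) / N
    return ex2 / eabs ** 2

# Standard normal: E[X^2] = 1, E|X| = sqrt(2/pi), so the ratio is pi/2.
r_norm = ratio(lambda: random.gauss(0.0, 1.0))

# Uniform(-1, 1): E[X^2] = 1/3, E|X| = 1/2, so the ratio is 4/3.
r_unif = ratio(lambda: random.uniform(-1.0, 1.0))

print(r_norm, r_unif)
```

Heavy-tailed examples (e.g. Pareto with shape just above 2) make the ratio arbitrarily large, so no finite upper bound holds without extra assumptions.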
3 votes
1 answer
94 views

Minimum number of Bernoulli trials until sum reaches threshold with high probability

Let $X_1, X_2, \dots$ be i.i.d. $Bern(p)$ with $p\in (0, 1)$. Let $\delta \in (0,1)$ and $m \in \mathbb{N}$. What is the smallest integer $n \in \mathbb{N}$ such that $$P\left( \sum_{i=1}^n X_i \geq m ...
MATHX • 153
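The question (as truncated) asks for the smallest $n$ with $P\left(\sum_{i=1}^n X_i \geq m\right) \geq 1-\delta$. There is no closed form, but since the tail probability is nondecreasing in $n$, the exact answer is a short scan over the binomial CDF — a sketch, not from the thread:

```python
from math import comb

def tail_ge_m(n, p, m):
    """P(Binomial(n, p) >= m), computed exactly."""
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(m, n + 1))

def smallest_n(p, m, delta):
    """Smallest n with P(sum of n Bern(p) trials >= m) >= 1 - delta.
    The tail probability only grows with n, so a linear scan is correct."""
    n = m  # at least m trials are needed to ever reach m successes
    while tail_ge_m(n, p, m) < 1 - delta:
        n += 1
    return n

print(smallest_n(0.5, 10, 0.05))
```

Chernoff-style bounds give the familiar $n \approx \frac{m}{p} + O\!\left(\frac{\sqrt{m\log(1/\delta)}}{p}\right)$ scaling, which this exact computation can be used to calibrate.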
1 vote
0 answers
27 views

Tight bounds for the expected maximum value of k IID Binomial(n, p) random variables

What are the tightest lower and upper bounds for the expected maximum of $k$ IID Binomial($n, p$) random variables? I tried to derive it: $$\Pr[\max \leq C] = \Big(\sum_{i = 0}^C {n \choose i}p^i(1 - p)^{n-i}\Big)^...
Goli Emami
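For what it's worth, the exact expectation follows from the tail-sum formula $E[\max] = \sum_{C=0}^{n-1}\big(1 - F(C)^k\big)$, where $F$ is the Binomial($n,p$) CDF — a handy exact baseline when testing candidate bounds. A sketch, not from the thread:

```python
from math import comb

def binom_cdf(n, p, c):
    """P(Binomial(n, p) <= c)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(c + 1))

def expected_max(k, n, p):
    """E[max of k iid Binomial(n, p)] via the tail-sum formula:
    E[M] = sum_{C=0}^{n-1} P(M > C) = sum_{C=0}^{n-1} (1 - F(C)^k)."""
    return sum(1.0 - binom_cdf(n, p, c) ** k for c in range(n))

# Sanity check: k = 1 recovers the plain mean n * p.
print(expected_max(1, 20, 0.3), expected_max(5, 20, 0.3))
```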
4 votes
1 answer
372 views

Exponential bound for the tail of a standard normal random variable

Let $X\sim N(0,1)$ and $a\geq 0$. I have to show that $$\mathbb{P}(X\geq a)\leq\frac{\exp(\frac{-a^2}{2})}{1+a}$$ I have no problem showing that $\mathbb{P}(X\geq a)\leq \frac{\exp(\frac{-a^2}{2})}{a\...
stats19 • 103
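The claimed bound improves on the usual $e^{-a^2/2}/(a\sqrt{2\pi})$ estimate near $a = 0$, where the latter blows up. Before hunting for a proof it is worth confirming the inequality numerically, e.g. with `math.erfc` — a quick check, not part of the thread:

```python
import math

def normal_tail(a):
    """P(X >= a) for X ~ N(0,1), via the complementary error function."""
    return 0.5 * math.erfc(a / math.sqrt(2.0))

def claimed_bound(a):
    """The candidate bound exp(-a^2/2) / (1 + a)."""
    return math.exp(-a * a / 2.0) / (1.0 + a)

# The inequality should hold for every a >= 0; scan a fine grid on [0, 10].
ok = all(normal_tail(a) <= claimed_bound(a) for a in [0.01 * i for i in range(1001)])
print(ok)
```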
2 votes
0 answers
92 views

Probabilistic bound on difference of Lipschitz random function

I am currently facing the following problem: Let $(X_1,Z_1),\ldots,(X_n,Z_n)$ be $n$ i.i.d. sample points from some distribution $p$ supported on $\mathcal X\times\{-1,1\}$, where $\mathcal X\subseteq ...
Stratos supports the strike
1 vote
0 answers
102 views

Concentration bound for the distribution of the difference of two random variables

If we use $\Rightarrow$ to represent convergence in distribution and suppose that $X_n \Rightarrow N(0,\sigma_1)$ and $Y_n \Rightarrow N(0,\sigma_2)$, and $X_n$ and $Y_n$ are independent, then we all ...
lmz • 11
2 votes
2 answers
118 views

Lower bound on the $\Phi$-entropy of a Gaussian variable

I am trying to prove that for $X$ a centered Gaussian variable, $$\limsup_{n\in\mathbb{N}}\,\mathbb{E}\left[(X+n)^2\log\left(\frac{(X+n)^2}{1+n^2}\right)\right]=2.$$ I already know by the Gaussian ...
John Do • 652
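The excerpt is cut off, but the claimed limit is easy to probe numerically: assuming $X \sim N(0,1)$ (so that $E[(X+n)^2] = 1+n^2$), a second-order expansion of $y \mapsto y\log\big(y/(1+n^2)\big)$ around $1+n^2$ gives $\approx \frac{2+4n^2}{2(1+n^2)} \to 2$. A quadrature sketch under that unit-variance assumption, not from the thread:

```python
import math

def phi_entropy_term(n, lo=-12.0, hi=12.0, steps=48_000):
    """E[(X + n)^2 * log((X + n)^2 / (1 + n^2))] for X ~ N(0,1),
    by trapezoidal quadrature against the standard Gaussian density."""
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps + 1):
        x = lo + i * h
        w = h if 0 < i < steps else h / 2.0  # trapezoid endpoint weights
        phi = math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)
        y = (x + n) ** 2
        total += w * phi * y * math.log(y / (1.0 + n * n))
    return total

print(phi_entropy_term(100.0))  # already close to the claimed limit 2
```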
15 votes
1 answer
204 views

Show that $\mathbb{E}\left|\hat{f_n}-f \right| \leq \frac{2}{n^{1/3}}$ where $\hat{f_n}$ is a density estimator for $f$

Question: Suppose we have a continuous probability density $f : \mathbb{R} \to [0,\infty)$ such that $\sup_{x \in \mathbb{R}}(\left|f(x)\right| + \left|f'(x)\right|) \leq 1$. Define the ...
yasinibrahim30
1 vote
0 answers
52 views

Does logistic regression not fulfill an inequality required for Wilks' Theorem or am I missing something?

The required inequality: Wilks' Theorem is given in the source below as Theorem 12.4.2, p. 515. Before stating the inequality, some definitions are needed: Let $Z_1, \dots, Z_n$ be i.i.d. according to ...
MathStudent
2 votes
0 answers
212 views

Hoeffding's Inequality Assumptions

I'm looking for the assumptions of Hoeffding's inequality, to check that it is applicable to my problem. So far the only assumptions I can find are that the variables $Z_i$ are IID and bounded. However, I'm ...
curiouscat22
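For reference, Hoeffding's inequality for independent $Z_i \in [a, b]$ gives $P\big(|\bar Z - E\bar Z| \geq t\big) \leq 2\exp\!\big(-2nt^2/(b-a)^2\big)$; independence and boundedness are the only requirements (identical distributions can even be dropped if each $Z_i$ has its own range $[a_i, b_i]$). A quick empirical check, not from the thread:

```python
import math
import random

random.seed(1)
n, t, trials = 500, 0.05, 2000
p = 0.3  # Z_i ~ Bernoulli(0.3), bounded in [0, 1]

# Empirical frequency of |sample mean - p| >= t over many repetitions.
hits = 0
for _ in range(trials):
    mean = sum(random.random() < p for _ in range(n)) / n
    hits += abs(mean - p) >= t
empirical = hits / trials

# Hoeffding bound: 2 * exp(-2 n t^2 / (b - a)^2), with b - a = 1 here.
bound = 2.0 * math.exp(-2.0 * n * t * t)
print(empirical, bound)
```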
1 vote
0 answers
131 views

High probability upper bound for linear combination of Gaussian random variables

Suppose that $x_1, \dots, x_n$ are i.i.d. with $x_i \sim N(0,I_k)$. Let $A_1, \dots, A_n$ be matrices with dimension $k \times k$ and $\|A_i\|_2 \leq 1$. Consider the following random vector $$y = \...
KRL • 1,180
1 vote
3 answers
906 views

Is there a way to bound expected value with limited information of the CDF?

Suppose I want to evaluate $E[X]$, where $X$ is a univariate random variable and takes values in $\mathcal{X}$, where the smallest element of $\mathcal{X}$ is 0 and the largest element of $\mathcal{X}$...
user52932 • 403
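The excerpt is truncated, but for a nonnegative variable bounded by $M$ the layer-cake identity $E[X] = \int_0^M (1 - F(x))\,dx$ turns knowledge of the CDF at finitely many points into two-sided bounds on the mean, using only the monotonicity of $F$. A sketch under those assumptions (the helper below is illustrative, not from the thread):

```python
def mean_bounds(M, known):
    """Bounds on E[X] for X supported in [0, M], given F(x) at finitely
    many points.  `known` is a sorted list of (x, F(x)) pairs.
    Layer cake: E[X] = integral_0^M (1 - F(x)) dx; by monotonicity,
    on each gap F lies between its values at the gap's endpoints."""
    pts = [(0.0, 0.0)] + list(known) + [(M, 1.0)]
    upper = lower = 0.0
    for (x0, f0), (x1, f1) in zip(pts, pts[1:]):
        width = x1 - x0
        upper += width * (1.0 - f0)  # F >= f0 throughout the gap
        lower += width * (1.0 - f1)  # F <= f1 throughout the gap
    return lower, upper

# Example: X ~ Uniform[0, 1] with F known only at 0.25, 0.5, 0.75.
lo, hi = mean_bounds(1.0, [(0.25, 0.25), (0.5, 0.5), (0.75, 0.75)])
print(lo, hi)  # brackets the true mean 0.5
```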
0 votes
0 answers
45 views

Using Markov's Inequality to Derive a Conclusion About a Random Variable

I'm wondering whether I can use Markov's inequality to reach the following statement. Given Markov's inequality for a non-negative random variable $X$, $P[X\geq a] \leq \frac{E[X]}{a}$, we can do the ...
kentropy • 548
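As stated, Markov's inequality needs only $X \geq 0$ and $E[X] < \infty$; whatever conclusion the (truncated) post is after, the basic form is cheap to sanity-check by simulation — a sketch, not from the thread:

```python
import random

random.seed(2)
N = 100_000
# X ~ Exponential(1), so X >= 0 and E[X] = 1; the true tail is exp(-a).
xs = [random.expovariate(1.0) for _ in range(N)]
mean = sum(xs) / N

for a in (1.0, 2.0, 5.0):
    frac = sum(x >= a for x in xs) / N
    # Markov: P(X >= a) <= E[X] / a.
    print(a, frac, mean / a)
```

The slack (e.g. $e^{-2} \approx 0.135$ against $1/2$ at $a=2$) is the usual reminder that Markov is loose but assumption-free.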
1 vote
1 answer
354 views

Rademacher Complexity Result

I was looking at one of the Rademacher generalisation bound proofs, which says: if $G$ is a family of functions mapping from $Z$ to $[0, 1]$ and $\mathcal{R}_m(G)$ denotes the Rademacher complexity ...
Ambar • 127
