Questions tagged [statistics]
Mathematical statistics is the study of statistics from a mathematical standpoint, using probability theory and other branches of mathematics such as linear algebra and analysis.
11,530 questions with no upvoted or accepted answers
58 votes · 0 answers · 2k views
Does the average primeness of natural numbers tend to zero?
Note 1: This question requires some new definitions, namely "continuous primeness", which I have made. Everyone is welcome to improve the definition without altering the spirit of the question. Click ...
23 votes · 1 answer · 769 views
Kähler Geodesics
Consider the Kähler manifold in coordinates $(a,b)$ given by the complex Riemannian metric
$$\begin{pmatrix} \frac{1}{1-|a|^2}&\frac{1}{1-a\bar{b}}\\\frac{1}{1-\bar{a}b}&\frac{1}{1-|b|^2}\end{...
13 votes · 0 answers · 256 views
Asymptotic behavior of recurrence $x_{n+1}=\mbox{Stdev}(x_1,\dots,x_n)$
Here $x_1>0$ is the initial condition and $x_{n+1}$ is defined by
$$x_{n+1}=\Big[\frac{1}{n}\sum_{k=1}^n x_k^2 -\frac{1}{n^2}\Big(\sum_{k=1}^n x_k\Big)^2 \Big]^{1/2}.$$
Clearly, $x_n=\lambda_n \...
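A quick way to get a feel for this question is to iterate the recurrence numerically; the sketch below (plain Python, using the population standard deviation exactly as in the displayed formula) assumes the initial condition $x_1 = 1$, which is my choice for illustration.

```python
import math

def iterate(x1, n_steps):
    """Iterate x_{n+1} = population stdev of (x_1, ..., x_n)."""
    xs = [x1]
    for n in range(1, n_steps):
        mean_sq = sum(x * x for x in xs) / n
        mean = sum(xs) / n
        # guard against tiny negative values from floating-point roundoff
        xs.append(math.sqrt(max(mean_sq - mean * mean, 0.0)))
    return xs

seq = iterate(1.0, 1000)
# e.g. x_2 = 0 (stdev of a single point) and x_3 = 0.5
```

The naive implementation is $O(n^2)$ but more than fast enough to inspect the growth rate of $x_n$ for a few thousand terms.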
13 votes · 0 answers · 1k views
Why is the partition function able to describe the whole system?
No matter what the real system or subject is, if there is a partition function $Z$, then identities of this kind hold
$$\langle X\rangle=\frac{\partial}{\partial Y}\left(-\log Z(Y)\right).$$
If one ...
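For a concrete instance of the identity, take a finite energy spectrum and check $\langle E\rangle = -\partial_\beta \log Z(\beta)$ by a central finite difference; the three-level spectrum below is an arbitrary assumption for illustration only.

```python
import math

energies = [0.0, 1.0, 2.0]  # toy spectrum (illustrative assumption)

def Z(beta):
    """Canonical partition function Z(beta) = sum_i exp(-beta * E_i)."""
    return sum(math.exp(-beta * e) for e in energies)

def mean_energy(beta):
    """Boltzmann average <E> = sum_i E_i exp(-beta E_i) / Z(beta)."""
    return sum(e * math.exp(-beta * e) for e in energies) / Z(beta)

beta, h = 1.0, 1e-6
lhs = mean_energy(beta)
# central finite difference of -log Z at beta
rhs = -(math.log(Z(beta + h)) - math.log(Z(beta - h))) / (2 * h)
```

The two numbers agree to within the finite-difference error, which is the content of the identity for $X = E$, $Y = \beta$.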
12 votes · 0 answers · 9k views
Rigorous Proof of Slutsky's Theorem
I was hoping to type up my proof of Slutsky's Theorem and get confirmation that the excruciating details are all correct...
Statement of Slutsky's Theorem:
$$\text{Let }X_n, \ X,\ Y_n,\ Y,\text{ share ...
12 votes · 0 answers · 246 views
Looking for references related to an inequality in order statistics
I was reading the paper "On the minimum of several random variables". In Example 10, item (ii), it states:
Let $1\leq k\leq n$. Let $g_i,i\leq n$, be independent $N(0,1)$ Gaussian random variables. ...
11 votes · 0 answers · 356 views
Random walks in $\mathbb{Z}^2$
Consider a random walk on the integer lattice in the plane. If a “particle” making a random walk arrives at a lattice point $p = (k_1,k_2)$ at time $t$, then one of the four neighbors $(k_1\pm 1, k_2 ...
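The step rule described in the excerpt is easy to simulate; the sketch below is a minimal stdlib-Python version, with the particle started at the origin (an assumption, since the excerpt is truncated).

```python
import random

random.seed(0)

def walk(steps):
    """Simple random walk on Z^2: each step moves to one of the
    four lattice neighbours with probability 1/4 each."""
    k1, k2 = 0, 0
    for _ in range(steps):
        dk1, dk2 = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        k1, k2 = k1 + dk1, k2 + dk2
    return k1, k2

p = walk(1000)
# parity check: after an even number of steps, k1 + k2 is even
```

This kind of simulation is useful for sanity-checking claims about recurrence or hitting probabilities before attempting a proof.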
11 votes · 0 answers · 352 views
Donsker's Theorem for triangular arrays
Assume we have a sequence of smooth i.i.d. random variables $(X_i)_{i=1}^{\infty}$. Given $\alpha>0$, does some sort of Donsker's Theorem hold for $\left(\frac{X_i}{n^{\alpha}}\right)_{i=1}^n$? ...
11 votes · 0 answers · 1k views
Idempotence and the Rao–Blackwell theorem
Original question:
In the Wikipedia article on the Rao–Blackwell theorem, we read:
In case the sufficient statistic is also a complete statistic, i.e., one which "admits no unbiased ...
10 votes · 1 answer · 298 views
Estimating Parameter - What is the qualitative difference between MLE fitting and Least Squares CDF fitting?
Given a parametric pdf $f(x;\lambda)$ and a set of data $\{ x_k \}_{k=1}^n$, here are two ways of formulating a problem of selecting an optimal parameter vector $\lambda^*$ to fit to the data. The ...
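The two formulations can be compared directly on simulated data. The sketch below fits an $\mathrm{Exp}(\lambda)$ model (my choice of example family) by a crude grid search over $\lambda$: once maximizing the likelihood, once minimizing the squared distance between the model CDF and the empirical CDF at the data points.

```python
import math
import random

random.seed(3)
data = sorted(random.expovariate(2.0) for _ in range(500))  # true lambda = 2
n = len(data)

def neg_loglik(lam):
    """MLE objective: -sum log f(x; lam) for f(x; lam) = lam exp(-lam x)."""
    return -sum(math.log(lam) - lam * x for x in data)

def cdf_sq_err(lam):
    """Least-squares distance between model CDF and empirical CDF."""
    return sum((1 - math.exp(-lam * x) - (k + 0.5) / n) ** 2
               for k, x in enumerate(data))

grid = [0.01 * j for j in range(1, 1000)]   # crude grid over lambda
lam_mle = min(grid, key=neg_loglik)
lam_lsq = min(grid, key=cdf_sq_err)
```

Both estimators land near the true value here; the qualitative difference the question asks about is in how each weighs the data (per-observation log-density versus squared CDF discrepancy), not in consistency.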
10 votes · 0 answers · 814 views
What is the variance of self-information (or surprisal)?
The self-information of an outcome $x_i$, or surprisal, is defined as:
$$
I(x_i)=-\log P(x_i),
$$
where $P$ means probability. This way, the Shannon entropy can be seen as the "average" or "expected" ...
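The quantity being asked about, $\operatorname{Var}[I(x)] = \mathbb{E}\big[(-\log P(x) - H)^2\big]$, is sometimes called the varentropy; for a concrete pmf it is straightforward to compute (the dyadic distribution below is just an example, in bits).

```python
import math

p = [0.5, 0.25, 0.125, 0.125]                 # example pmf (assumed)
surprisal = [-math.log2(q) for q in p]        # I(x) = -log2 P(x)
H = sum(q * s for q, s in zip(p, surprisal))  # Shannon entropy = E[I]
var_I = sum(q * (s - H) ** 2                  # varentropy = Var[I]
            for q, s in zip(p, surprisal))
```

For this pmf the surprisals are $1, 2, 3, 3$ bits, so $H = 1.75$ bits and the variance of the self-information is $0.6875$ bits squared.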
10 votes · 0 answers · 7k views
Exponential distribution unbiased estimator
Let $$X_1, \ldots, X_n \overset{iid}{\sim} Exp(\lambda), \quad \lambda > 0$$
The Maximum-Likelihood-Estimator is given by $$\widehat{\lambda} = \frac{1}{\frac{1}{n}\sum_{i=1}^{n}{X_i}} = \frac{n}{\...
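Since $\sum X_i \sim \mathrm{Gamma}(n, \lambda)$, the MLE satisfies $\mathbb{E}[\widehat{\lambda}] = \frac{n}{n-1}\lambda$, so $\frac{n-1}{\sum X_i}$ is the unbiased correction. A quick simulation (with assumed values $\lambda = 2$, $n = 5$) illustrates the bias.

```python
import random

random.seed(0)
lam, n, reps = 2.0, 5, 100_000

mean_mle = 0.0
for _ in range(reps):
    s = sum(random.expovariate(lam) for _ in range(n))
    mean_mle += n / s          # MLE lambda-hat = n / sum X_i
mean_mle /= reps
# theory: E[lambda-hat] = n * lam / (n - 1) = 2.5 here, not 2.0
```

The simulated mean of the MLE sits near $2.5$, matching $\frac{n}{n-1}\lambda$ and confirming the bias that the unbiased estimator removes.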
10 votes · 0 answers · 877 views
The parametrization of a Gumbel in terms of a Gaussian
Extreme Value Distribution From a Gaussian. I was wondering how the parametrization of $\alpha$ and $\beta$ of a Gumbel $e^{-e^{-\frac{x-\alpha }{\beta }}}$ was done in terms of a cumulative Gaussian $...
9 votes · 0 answers · 169 views
Covering number/Metric Entropy of the unit ball with respect to Mahalanobis distance
Let $B$ denote the unit ball in $\mathbb{R}^d$ and $N(\epsilon, B, d)$ be the cardinality of the smallest $\epsilon$-cover of $B$. An $\epsilon$-cover is a set $T \subset B$ such that for any $x \in B$, ...
9 votes · 0 answers · 200 views
Distributions with 'Gaussian Tails'
In a paper I was reading, the following seemingly artificial assumption is used:
suppose $f$ is some probability density function on $\mathbb{R}^d$, and let $\phi$ denote the density of a $N(0,I_d)$ ...
9 votes · 0 answers · 2k views
How can you measure how "shuffled" a deck of cards is?
A few days ago I asked for some methods of measuring how shuffled a deck of cards was. Predictably there were a lot of suggested methods, which got me thinking, which is the best one? I think it'd be ...
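One classical answer, used in the Bayer–Diaconis analysis of riffle shuffling, is to count the rising sequences of the deck's permutation: a sorted deck has exactly one, and a uniformly random deck has about $(n+1)/2$ on average. A minimal sketch, with cards labelled $0..n-1$:

```python
import random

def rising_sequences(perm):
    """Number of rising sequences: card c starts a new one unless
    card c-1 appears earlier in the deck (cards are 0..n-1)."""
    pos = {card: i for i, card in enumerate(perm)}
    return sum(1 for c in perm if c == 0 or pos[c - 1] > pos[c])

random.seed(4)
deck = list(range(52))
random.shuffle(deck)
r = rising_sequences(deck)   # near (52+1)/2 = 26.5 for a well-shuffled deck
```

Tracking how this statistic moves from 1 toward its uniform-case mean as you apply successive riffle shuffles is one concrete way to compare the suggested measures.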
9 votes · 2 answers · 324 views
Limiting distribution of binary variable (Central limit theorem fails)
Suppose we have a random variable
$$Y_i = i \text{ with probability } \frac{1}{i}$$ and $0$ otherwise. Here all the $Y_i$ are independent.
We can redefine $X_i = Y_i -1 $ so that $E(X_i)=0$.
Then the ...
9 votes · 0 answers · 228 views
Is there a well-defined `uniform' distribution on $C([0, 1])$?
I'm wondering whether we can define a uniform distribution on the space of continuous functions over a compact set, e.g. $C([0, 1])$. If so, then how should I rigorously describe it? And how can I ...
9 votes · 0 answers · 544 views
The sum of eigenvalues of integral operator $S(f)(x)=\int_{\mathcal{X}} k(x,y)f(y)d\mu(y)$ is given by $\int_{\mathcal{X}} k(x,x) d\mu(x)$?
Setup: Let $(\mathcal{X},d_{\mathcal{X}})$ and $(\mathcal{Y},d_{\mathcal{Y}})$ be two separable metric spaces. Let $M^1(\mathcal{X})$ be the space of Borel probability measures on $\mathcal{X}$ with ...
9 votes · 1 answer · 265 views
Hottest Days of The Year
Recently, there has been much talk in the media of it being the hottest day of the year so far. It has always seemed to me that there are likely many more of these in the northern hemisphere than the ...
9 votes · 0 answers · 223 views
Finding an upper bound for $\frac{d}{d\theta}\beta^*(\theta)|_{\theta=\theta_0}$
Suppose that a random variable X has a distribution depending on a parameter $\theta$, $\theta \in \Theta$, and consider a test of hypothesis $H_0: \theta = \theta_0$ versus the alternative $H_1: \...
9 votes · 0 answers · 463 views
Does this calculation have a name, or a generic formulation?
Background Information
I would appreciate help in identifying or explaining this operation:
To calculate each of the $n$ values of $f(\Phi)$:
Sample from the distribution of each of $i$ parameters, $\...
8 votes · 0 answers · 266 views
Only three types of limit of distributions truncated to a finite interval in the upper tail?
Suppose random variable $X$ has a continuous probability distribution with an unbounded upper tail; that is, the CDF of $X$ (call it $F$) is absolutely continuous and $F(x)<1$ for all $x\in\mathbb{...
8 votes · 3 answers · 19k views
Choosing $H_0$ and $H_a$ in hypothesis testing
There seems to be some ambiguity or contradiction in how to correctly choose the null and alternative hypotheses, both online and in my instructor's notes. I'm trying to figure out if this stems ...
8 votes · 0 answers · 3k views
Empirical quantile function: uniform convergence
Let $X_1,...,X_n$ denote independent and identically distributed random variables, with $X_i \sim F$, $1 \leq i \leq n$. Assume $F$ is continuous. Then we know that its generalized inverse (quantile ...
7 votes · 0 answers · 153 views
Sum of two independent random variables: distribution function and quantile function
If $X,Y$ are two independent random variables with CDFs $F_X,F_Y$, their sum has CDF $F_X \star F_Y$ ($\star$ is the convolution product).
What can be said about the quantile function of $X+Y$? The ...
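In general the quantile function of $X+Y$ has no closed form even when those of $X$ and $Y$ do, but any quantile of the sum can be approximated by simulation. The sketch below uses $X \sim \mathrm{Exp}(1)$ and $Y \sim U(0,1)$ (my choice of example), for which the median of the sum can be computed exactly as $\ln(2(e-1)) \approx 1.2346$.

```python
import math
import random

random.seed(1)
n = 100_000
# Monte Carlo sample of X + Y, sorted once for quantile lookups
samples = sorted(random.expovariate(1.0) + random.random() for _ in range(n))

def quantile(q):
    """Empirical quantile of X + Y from the Monte Carlo sample."""
    return samples[min(int(q * n), n - 1)]

median = quantile(0.5)
exact = math.log(2 * (math.e - 1))   # exact median for this example
```

The empirical median matches the exact value to Monte Carlo accuracy, which is usually the best one can do for the quantile of a general sum.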
7 votes · 0 answers · 1k views
How is Optimal Transport algorithmically related to the Assignment Problem?
In optimal transport, we calculate the distance between two probability measures $\mu$ and $\nu$ over the compact set $[a,b]\subset\mathbb R$, using the Earth Mover's distance, which is a special case ...
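On the real line with cost $|x-y|$, the optimal assignment between two equal-size point sets is the monotone (sorted) matching, which is why 1D optimal transport between empirical measures reduces to sorting rather than a general assignment solver. A brute-force check on a tiny instance (points chosen arbitrarily):

```python
from itertools import permutations

xs = [0.1, 0.9, 0.4]          # arbitrary example points
ys = [0.5, 0.2, 0.8]

def cost(perm):
    """Total |x - y| cost of assigning xs[i] -> ys[perm[i]]."""
    return sum(abs(x - ys[i]) for x, i in zip(xs, perm))

# exhaustive assignment-problem optimum
brute = min(cost(p) for p in permutations(range(len(ys))))
# monotone matching: sort both sides and pair in order
monotone = sum(abs(a - b) for a, b in zip(sorted(xs), sorted(ys)))
```

For larger instances or higher dimensions the monotone shortcut no longer applies, and one genuinely needs an assignment algorithm (e.g. Hungarian) on the cost matrix.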
7 votes · 0 answers · 13k views
Show that $Cov(\bar{y},\hat{\beta_1})=0$
Show that $Cov(\bar{y},\hat{\beta_1})=0$
For those unfamiliar with statistics, Cov(A,B) refers to the covariance function. $\bar{y}$ refers to the average of the response (dependent variable). $\hat{\...
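The identity can be sanity-checked by simulation: under a model $y_i = \beta_0 + \beta_1 x_i + \varepsilon_i$ with fixed design (the design points, coefficients, and Gaussian noise below are my own illustrative choices), the sample covariance of $\bar{y}$ and $\hat{\beta}_1$ across replications should hover near zero.

```python
import random

random.seed(2)
x = [1.0, 2.0, 3.0, 4.0, 5.0]           # fixed design (assumed)
xbar = sum(x) / len(x)
sxx = sum((xi - xbar) ** 2 for xi in x)

ybars, slopes = [], []
for _ in range(20_000):
    # true model y = 1 + 0.5 x + N(0,1) noise (assumed for illustration)
    y = [1.0 + 0.5 * xi + random.gauss(0.0, 1.0) for xi in x]
    ybars.append(sum(y) / len(y))
    slopes.append(sum((xi - xbar) * yi for xi, yi in zip(x, y)) / sxx)

mb = sum(ybars) / len(ybars)
ms = sum(slopes) / len(slopes)
cov = sum((a - mb) * (b - ms) for a, b in zip(ybars, slopes)) / len(ybars)
```

The simulated covariance is tiny, consistent with the algebraic fact that $\bar{y}$ depends on the errors only through $\sum \varepsilon_i$ while $\hat{\beta}_1$ depends on them through the centered weights $x_i - \bar{x}$, which sum to zero.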
7 votes · 1 answer · 1k views
Sum of best X dice in Y dice rolled (or roll X pick best Y) odds/calculation
Background: In many pen and paper RPGs there is often an option or bonus/penalty to rolls that incorporates rolling multiples of the required die and taking the best or worst of those rolls for your ...
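For modest numbers of dice the distribution can be enumerated exactly rather than simulated; the sketch below computes the expected sum of the best $x$ of $y$ dice by brute force over all $s^y$ outcomes, which is perfectly feasible for e.g. 4d6.

```python
from fractions import Fraction
from itertools import product

def exact_mean_best(x, y, sides=6):
    """Exact expected sum of the best x dice out of y rolled."""
    total = Fraction(0)
    for roll in product(range(1, sides + 1), repeat=y):
        total += sum(sorted(roll)[-x:])   # keep the x highest dice
    return total / Fraction(sides) ** y

best3of4 = exact_mean_best(3, 4)   # the classic 'roll 4d6, drop lowest'
```

Using `Fraction` keeps the answer exact; the same enumeration also yields the full probability distribution if you tally sums instead of averaging them.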
7 votes · 0 answers · 1k views
How to get the general form of the solution of exercise 5.4-2 of CLRS as shown in Wikipedia?
Exercise
Suppose that we toss balls into b bins until some bin contains two balls. Each toss is independent, and each ball is equally likely to end up in any bin. What is the expected number of ball ...
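The expectation in the exercise can be computed exactly from tail probabilities: $N > k$ iff the first $k$ tosses land in distinct bins, so $\mathbb{E}[N] = \sum_{k \ge 0} \Pr(N > k) = \sum_{k=0}^{b} \frac{b!}{(b-k)!\,b^k}$. A short exact computation of this sum:

```python
from fractions import Fraction

def expected_tosses(b):
    """E[number of tosses until some bin holds two balls], with b bins.
    Uses E[N] = sum_{k>=0} P(N > k), where
    P(N > k) = P(first k tosses all distinct) = b!/((b-k)! b^k)."""
    total, p, k = Fraction(0), Fraction(1), 0
    while p > 0:
        total += p                 # add P(N > k)
        p *= Fraction(b - k, b)    # extend 'all distinct' by one more toss
        k += 1
    return total

e2 = expected_tosses(2)              # 5/2: with 2 bins, N is 2 or 3, each w.p. 1/2
e365 = float(expected_tosses(365))   # the birthday setting, about 24.6
```

This matches the general form quoted in the Wikipedia treatment of the birthday problem, of which the exercise is the balls-and-bins restatement.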