11
votes
Accepted
Confusion on defining uniform distribution on hypersphere and its sampling problem
The uniform distribution would be:
\begin{equation*}
f(x) \equiv \frac{\Gamma(\frac{d}{2})}{2 \pi^{\frac{d}{2}}}
\end{equation*}
This all sounds reasonable and conventional.
You haven't defined a ...
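A standard way to sample from this uniform distribution on $S^{d-1}$ (my addition, not part of the quoted answer) is to normalize a vector of i.i.d. standard Gaussians; rotational invariance of the Gaussian makes the result uniform on the sphere. A minimal sketch using only the Python standard library:

```python
import math
import random

def sample_hypersphere(d, rng):
    """Draw one point uniformly from the unit sphere S^{d-1} in R^d
    by normalizing a vector of i.i.d. standard Gaussians."""
    v = [rng.gauss(0.0, 1.0) for _ in range(d)]
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

rng = random.Random(0)
pts = [sample_hypersphere(3, rng) for _ in range(20_000)]

# Sanity check: each coordinate of a uniform point on S^2 has
# mean 0 and variance 1/3.
mean_z = sum(p[2] for p in pts) / len(pts)
var_z = sum(p[2] ** 2 for p in pts) / len(pts)
```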
4
votes
Probability of 3 darts landing in the same half of the board
The “darts on a board” version is equivalent to the “points on a circle” version. When you throw darts on a board, you can radially project the darts onto the circumference of the dartboard; these ...
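After projecting to the circle, the known answer for $n$ points is $n/2^{n-1}$, so $3/4$ for three darts. A Monte Carlo check (my addition, not part of the quoted answer): all angles fit in some semicircle exactly when the largest gap between consecutive sorted angles is at least $\pi$.

```python
import math
import random

def all_in_some_semicircle(angles):
    """True iff all angles (in radians) fit in some half of the circle,
    i.e. the largest gap between consecutive sorted angles is >= pi."""
    a = sorted(angles)
    gaps = [b - x for x, b in zip(a, a[1:])]
    gaps.append(2 * math.pi - (a[-1] - a[0]))  # wrap-around gap
    return max(gaps) >= math.pi

rng = random.Random(1)
trials = 100_000
hits = sum(
    all_in_some_semicircle([rng.uniform(0, 2 * math.pi) for _ in range(3)])
    for _ in range(trials)
)
p_hat = hits / trials  # should be close to 3/4
```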
2
votes
Accepted
Are $\mu_{\hat{p}}$ and $\sigma_{\hat{p}}$ considered parameters or statistics?
There are a variety of definitions of both, but for me a "parameter" is a value that underpins the behaviour of some random variable, while a "statistic" is a value calculated from ...
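To make the distinction concrete (a hypothetical example I'm adding, assuming $\hat{p}$ is a sample proportion): $\mu_{\hat{p}} = p$ and $\sigma_{\hat{p}} = \sqrt{p(1-p)/n}$ are fixed values that underpin the sampling distribution of $\hat{p}$, i.e. parameters, while each observed $\hat{p}$ is computed from data, i.e. a statistic.

```python
import math
import random

p, n = 0.3, 50  # parameters: success probability and sample size
mu_phat = p                              # parameter: mean of p-hat's sampling distribution
sigma_phat = math.sqrt(p * (1 - p) / n)  # parameter: its standard deviation

rng = random.Random(2)
# Each simulated p-hat below is a statistic computed from one sample of size n.
phats = [sum(rng.random() < p for _ in range(n)) / n for _ in range(20_000)]
emp_mean = sum(phats) / len(phats)
emp_sd = math.sqrt(sum((x - emp_mean) ** 2 for x in phats) / len(phats))
```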
2
votes
Confusion on defining uniform distribution on hypersphere and its sampling problem
Given any $d$-dimensional surface $S \subset \mathbb{R}^n$ (or even any Riemannian manifold $S$), there is a surface measure $\mu$ on $S$, which is defined in local coordinates by $\mu(dx) = \sqrt{g(x)...
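For a concrete instance (a standard computation I'm adding, not part of the quoted answer): on the unit 2-sphere $S^2 \subset \mathbb{R}^3$ in spherical coordinates $(\theta, \varphi)$, the induced metric gives $\sqrt{g(\theta,\varphi)} = \sin\theta$, so
$$
\mu(d\theta \, d\varphi) = \sin\theta \, d\theta \, d\varphi,
\qquad
\mu(S^2) = \int_0^{2\pi} \int_0^{\pi} \sin\theta \, d\theta \, d\varphi = 4\pi,
$$
recovering the familiar surface area of the unit sphere.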
1
vote
Are $\mu_{\hat{p}}$ and $\sigma_{\hat{p}}$ considered parameters or statistics?
Moments of a (parametric) distribution are parameters. That's why they are called parametric distributions. That the distribution is a sampling distribution is immaterial.
For instance, if we have ...
1
vote
Confusion on defining uniform distribution on hypersphere and its sampling problem
Regarding the definition
This is a very natural question to ask, because the word "uniform" has multiple meanings.
For instance, the nature of uniformity when considering a set of n points, ...
1
vote
Confusion on defining uniform distribution on hypersphere and its sampling problem
Your first method does not give a correct uniform sampling of the hypersphere. By this, I mean if you have a generic function $f$ defined on the sphere:
$$
\int f \, d\mu \neq \langle f(x)\rangle
$$
with $...
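One way to see such a failure concretely (a sketch I'm adding, with the test function $f(x,y,z) = z^2$ on $S^2$, whose uniform average is $1/3$): drawing the spherical angles $\theta \sim U[0,\pi]$, $\varphi \sim U[0,2\pi]$ over-weights the poles, and the sample average of $f$ comes out as $E[\cos^2\theta] = 1/2$ instead.

```python
import math
import random

rng = random.Random(3)
N = 100_000

# Non-uniform method: uniform spherical angles (NOT uniform on the sphere).
naive = sum(math.cos(rng.uniform(0, math.pi)) ** 2 for _ in range(N)) / N

# Uniform method: normalize i.i.d. Gaussians, then average f(x, y, z) = z^2.
def z2_uniform():
    x, y, z = (rng.gauss(0.0, 1.0) for _ in range(3))
    return z * z / (x * x + y * y + z * z)

correct = sum(z2_uniform() for _ in range(N)) / N
# naive is close to 1/2; correct is close to the true value 1/3
```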
1
vote
Accepted
Finding Probability Limit
If you want a direct argument, you can just modify the proof of the WLLN (for square-integrable independent RVs), which is short anyway.
Assuming your samples are i.i.d., then $E[W_n] = \frac{n-1}{n}\...
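A numerical illustration (my addition; I'm assuming from the quoted $E[W_n] = \frac{n-1}{n} \cdot {}$... that $W_n$ is the biased sample variance $\frac{1}{n}\sum_i (X_i - \bar{X}_n)^2$, whose probability limit is $\sigma^2$):

```python
import random

def w_n(xs):
    """Biased sample variance (1/n) * sum (x_i - xbar)^2 -- the assumed W_n."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

rng = random.Random(4)
sigma2 = 4.0  # true variance of the N(0, 2) draws below
# E[W_n] = (n-1)/n * sigma^2, and Chebyshev's inequality then gives
# W_n -> sigma^2 in probability; at n = 100_000 the realization is close.
w_big = w_n([rng.gauss(0.0, 2.0) for _ in range(100_000)])
```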