23
votes
Is the variance of the mean of a set of possibly dependent random variables less than the average of their respective variances?
Yes, it is true. Here is a proof. $$
\begin{align}
\newcommand{\Var}{\operatorname{Var}}
&\Var(\overline{X})
\\
&= \frac1{n^2}\Var\left(\sum_{i=1}^n X_i\right)
\\
&=\frac1{n^2}\sum_{i=1}^n\...
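A quick numeric sanity check of the claim (my own sketch, not part of the answer): draw correlated Gaussian samples and compare the variance of the coordinate-wise mean with the average of the coordinate variances. The covariance matrix below is an arbitrary example with dependence between coordinates.

```python
import numpy as np

rng = np.random.default_rng(0)
# An arbitrary covariance matrix with dependence between the three coordinates.
cov = np.array([[2.0, 0.8, 0.3],
                [0.8, 1.0, 0.5],
                [0.3, 0.5, 1.5]])
samples = rng.multivariate_normal(np.zeros(3), cov, size=200_000)

var_of_mean = samples.mean(axis=1).var()   # Var of the mean of the X_i
avg_of_vars = samples.var(axis=0).mean()   # average of the Var(X_i)
assert var_of_mean <= avg_of_vars          # holds for any joint distribution
```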
14
votes
Accepted
Is the variance of the mean of a set of possibly dependent random variables less than the average of their respective variances?
In general, one has:

$$
\begin{align}
\operatorname{Var}\left(\sum_{k=0}^n X_k\right)
&= \sum_{i,j=0}^n \operatorname{Cov}(X_i,X_j)
\end{align}
$$
Now, the well-known inequality $ab \le \frac{...
13
votes
Is the variance of the mean of a set of possibly dependent random variables less than the average of their respective variances?
A way to see this at a glance is that real random variables form an inner product space, with $\langle X,Y \rangle = \mathbb{E}XY$.
The norm induced by this inner product is $\|X\|^2=\mathbb{E}X^2$, ...
4
votes
Is the variance of the mean of a set of possibly dependent random variables less than the average of their respective variances?
$$\text{Var}\left(\sum X_i\right) = \sum\limits_i \text{Var}\left( X_i\right) +\sum\limits_i \sum\limits_{j\not=i} \text{Cov}\left( X_i,X_j\right)$$ is maximised when the covariances take their ...
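The extreme case of that decomposition can be checked numerically (my illustration, not the answerer's code): when all the $X_i$ are copies of the same variable, every covariance equals the common variance, and the variance of the mean exactly equals the average of the variances.

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.normal(size=100_000)
samples = np.stack([x, x, x], axis=1)      # three perfectly dependent copies

var_of_mean = samples.mean(axis=1).var()   # mean of (x, x, x) is just x
avg_of_vars = samples.var(axis=0).mean()   # each coordinate has the same variance
assert abs(var_of_mean - avg_of_vars) < 1e-12   # equality in the extreme case
```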
3
votes
Limit of coin flip probability sample variance
Variance is a measure of how spread out the distribution of results in an experiment is. If the coin is always landing on heads, or always landing on tails, then every experiment has the same result, ...
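A small simulation along those lines (my own sketch, not from the answer): the sample variance of Bernoulli($p$) flips concentrates around $p(1-p)$, which is $0$ when the coin always lands the same way and maximal at $p = 1/2$.

```python
import random

random.seed(1)

def sample_variance(p, n=100_000):
    """Unbiased sample variance of n Bernoulli(p) coin flips."""
    flips = [1 if random.random() < p else 0 for _ in range(n)]
    m = sum(flips) / n
    return sum((x - m) ** 2 for x in flips) / (n - 1)

for p in (0.0, 0.5, 1.0):
    print(p, sample_variance(p))   # approx p*(1-p): 0 at the extremes
```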
2
votes
Limit of coin flip probability sample variance
Maybe you need to take a Bayesian perspective and encode your beliefs in a prior. I think this post provides a good example which uses conjugate priors. If you chose $\beta$ and $\alpha$ as non-zero, ...
2
votes
Why do variances add when summing independent random variables?
I think you can indeed think about it in terms of Pythagorean theorem. So the point is that you can think of random variables as vectors in some space, so the variance of a random variable is like the ...
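The Pythagorean analogy can be checked numerically (an assumed example, not the answerer's code): for independent $X$ and $Y$, $\operatorname{Var}(X+Y) = \operatorname{Var}(X) + \operatorname{Var}(Y)$, just as $\|u+v\|^2 = \|u\|^2 + \|v\|^2$ for orthogonal vectors.

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(0.0, 2.0, size=500_000)    # Var(X) = 4
y = rng.exponential(3.0, size=500_000)    # Var(Y) = 9, drawn independently
# The cross term 2*Cov(X, Y) vanishes under independence, so the
# sample variances add up to within simulation noise.
assert abs(np.var(x + y) - (np.var(x) + np.var(y))) < 0.05
```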
1
vote
Confusion in using applying variance formula
The variance of a sum of independent random variables $X$ and $Y$ is the sum of their variances; i.e.,
$$\operatorname{Var}[X+Y] \overset{\text{ind}}{=} \operatorname{Var}[X] + \operatorname{Var}[Y],$$...
1
vote
Why do variances add when summing independent random variables?
The other answers explain how the additivity of variance for sums of independent random variables on a probability space $(\Omega, \mathcal{F}, \mathbb{P})$ follows from orthogonality/Pythagoras' ...