23 votes

Is the variance of the mean of a set of possibly dependent random variables less than the average of their respective variances?

Yes, it is true. Here is a proof. $$ \begin{align} \newcommand{\Var}{\operatorname{Var}} &\Var(\overline{X}) \\ &= \frac1{n^2}\Var\left(\sum_{i=1}^n X_i\right) \\ &=\frac1{n^2}\sum_{i=1}^n\...
Mike Earnest • 78.3k
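The inequality in this proof can be checked numerically. Below is a small sketch (my own, not from the answer) that builds three strongly dependent variables and verifies that the variance of their sample-wise mean stays below the average of their individual variances:

```python
import random

# Numeric check of the claim: for possibly dependent random variables
# X_1..X_n, Var(mean) <= average of the individual variances.
random.seed(0)

def var(xs):
    mu = sum(xs) / len(xs)
    return sum((x - mu) ** 2 for x in xs) / len(xs)

# Three strongly dependent variables: X2 = X1 + noise, X3 = -X1.
x1 = [random.gauss(0, 1) for _ in range(100_000)]
x2 = [a + 0.5 * random.gauss(0, 1) for a in x1]
x3 = [-a for a in x1]

means = [(a + b + c) / 3 for a, b, c in zip(x1, x2, x3)]
lhs = var(means)                          # Var of the sample-wise mean
rhs = (var(x1) + var(x2) + var(x3)) / 3   # average of the variances

assert lhs <= rhs
```

Here `lhs` comes out near 0.14 while `rhs` is near 1.08, so the bound holds with plenty of room even though the variables are far from independent.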
14 votes
Accepted

Is the variance of the mean of a set of possibly dependent random variables less than the average of their respective variances?

In general, one has: $$ \begin{align} \operatorname{Var}\left(\sum_{k=0}^n X_k\right) &= \sum_{i,j=0}^n \operatorname{Cov}(X_i,X_j) \end{align} $$ Now, the well-known inequality $ab \le \frac{...
Abezhiko • 10.3k
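The pairwise bound this answer relies on can be illustrated numerically. A minimal sketch (my own): by Cauchy–Schwarz, $|\operatorname{Cov}(X_i,X_j)| \le \sigma_i\sigma_j$, and then $ab \le \frac{a^2+b^2}{2}$ gives $\operatorname{Cov}(X_i,X_j) \le \frac{\operatorname{Var}(X_i)+\operatorname{Var}(X_j)}{2}$:

```python
import random

# Illustration of the bound: Cov(X_i, X_j) <= (Var(X_i) + Var(X_j)) / 2,
# from |Cov| <= s_i * s_j (Cauchy-Schwarz) and ab <= (a^2 + b^2) / 2.
random.seed(1)
m = 100_000

def var(xs):
    mu = sum(xs) / len(xs)
    return sum((x - mu) ** 2 for x in xs) / len(xs)

def cov(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

z = [random.gauss(0, 1) for _ in range(m)]
x = [2 * a for a in z]                   # Var ~ 4
y = [a + random.gauss(0, 1) for a in z]  # Var ~ 2, Cov(x, y) ~ 2

assert cov(x, y) <= (var(x) + var(y)) / 2
```

Summing this bound over all $(i,j)$ pairs is exactly what turns the double covariance sum into $n$ times the sum of the variances.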
13 votes

Is the variance of the mean of a set of possibly dependent random variables less than the average of their respective variances?

A way to see this at a glance is that real random variables form an inner product space, with $\langle X,Y \rangle = \mathbb{E}XY$. The norm induced by this inner product is $\|X\|^2=\mathbb{E}X^2$, ...
Zoe Allen • 5,633
4 votes

Is the variance of the mean of a set of possibly dependent random variables less than the average of their respective variances?

$$\text{Var}\left(\sum X_i\right) = \sum\limits_i \text{Var}\left( X_i\right) +\sum\limits_i \sum\limits_{j\not=i} \text{Cov}\left( X_i,X_j\right)$$ is maximised when the covariances take their ...
Henry • 159k
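The extremal case in this argument is easy to see numerically. A sketch (my own): when the variables are perfectly correlated copies of each other, every covariance term is maximal and the inequality becomes an equality:

```python
import random

# Equality case: with X_1 = X_2 = X_3 (correlation +1 throughout),
# the covariances are maximal and Var(mean) equals the average variance.
random.seed(2)
m = 100_000

def var(xs):
    mu = sum(xs) / len(xs)
    return sum((x - mu) ** 2 for x in xs) / len(xs)

x = [random.gauss(0, 1) for _ in range(m)]
cols = [x, x, x]
means = [sum(t) / 3 for t in zip(*cols)]

lhs = var(means)
rhs = sum(var(c) for c in cols) / 3
assert abs(lhs - rhs) < 1e-9   # Var(mean) == average variance
```

Any imperfect correlation strictly shrinks the covariance terms, which is why the mean of dependent-but-not-identical variables always has strictly smaller variance than this worst case.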
3 votes

Limit of coin flip probability sample variance

Variance is a measure of how spread out the distribution of results in an experiment is. If the coin is always landing on heads, or always landing on tails, then every experiment has the same result, ...
mike1994 • 151
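This behaviour is just the Bernoulli variance $p(1-p)$ vanishing at the endpoints. A small sketch (my own) checking the sample variance of simulated flips at the extremes and at $p = 1/2$:

```python
import random

# For a Bernoulli(p) coin the variance is p*(1-p): it vanishes as
# p -> 0 or p -> 1, since a coin that always lands the same way
# produces no spread at all, and is maximal (1/4) at p = 1/2.
random.seed(3)

def sample_variance(p, m=100_000):
    flips = [1 if random.random() < p else 0 for _ in range(m)]
    mu = sum(flips) / m
    return sum((f - mu) ** 2 for f in flips) / m

assert sample_variance(0.0) == 0.0              # all tails: zero spread
assert sample_variance(1.0) == 0.0              # all heads: zero spread
assert abs(sample_variance(0.5) - 0.25) < 0.01  # near the maximum 1/4
```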
2 votes

Limit of coin flip probability sample variance

Maybe you need to take a Bayesian perspective and encode your beliefs in a prior. I think this post provides a good example which uses conjugate priors. If you choose $\beta$ and $\alpha$ as non-zero, ...
J.K. • 270
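The conjugate update itself is a one-liner. A minimal sketch (my own, not reproduced from the linked post, with a uniform $\mathrm{Beta}(1,1)$ prior assumed for illustration): a $\mathrm{Beta}(\alpha,\beta)$ prior on the heads-probability becomes $\mathrm{Beta}(\alpha+h,\beta+t)$ after observing $h$ heads and $t$ tails, and non-zero $\alpha,\beta$ keep the estimate away from the degenerate edges 0 and 1:

```python
# Beta-Bernoulli conjugate update: prior Beta(alpha, beta), data = h heads
# and t tails, posterior Beta(alpha + h, beta + t).
def posterior(alpha, beta, heads, tails):
    return alpha + heads, beta + tails

def posterior_mean(alpha, beta, heads, tails):
    a, b = posterior(alpha, beta, heads, tails)
    return a / (a + b)

# 10 flips, all heads: the MLE would be exactly 1.0, but the (assumed)
# uniform Beta(1, 1) prior tempers the estimate.
est = posterior_mean(1, 1, 10, 0)
assert est == 11 / 12
```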
2 votes

Why do variances add when summing independent random variables?

I think you can indeed think about it in terms of the Pythagorean theorem. So the point is that you can think of random variables as vectors in some space, so the variance of a random variable is like the ...
Alan Chung • 1,222
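The Pythagorean picture can be checked with a quick simulation. A sketch (my own): for independent $X$ and $Y$, $\operatorname{Var}(X+Y) = \operatorname{Var}(X) + \operatorname{Var}(Y)$, just as orthogonal vectors satisfy $\|x+y\|^2 = \|x\|^2 + \|y\|^2$:

```python
import random

# For independent X and Y, Var(X + Y) = Var(X) + Var(Y): independence
# plays the role of orthogonality in the vector-space picture.
random.seed(4)
m = 200_000

def var(xs):
    mu = sum(xs) / len(xs)
    return sum((x - mu) ** 2 for x in xs) / len(xs)

x = [random.gauss(0, 1) for _ in range(m)]   # Var = 1
y = [random.gauss(0, 2) for _ in range(m)]   # Var = 4, independent of x
s = [a + b for a, b in zip(x, y)]

assert abs(var(s) - (var(x) + var(y))) < 0.1
```

The leftover difference is twice the sample covariance of `x` and `y`, which shrinks to zero as the sample grows, precisely because the cross term $2\operatorname{Cov}(X,Y)$ vanishes under independence.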
1 vote

Confusion in using applying variance formula

The variance of a sum of independent random variables $X$ and $Y$ is the sum of their variances; i.e., $$\operatorname{Var}[X+Y] \overset{\text{ind}}{=} \operatorname{Var}[X] + \operatorname{Var}[Y],$$...
heropup • 141k
1 vote

Why do variances add when summing independent random variables?

The other answers explain how the additivity of variance for sums of independent random variables on a probability space $(\Omega, \mathcal{F}, \mathbb{P})$ follows from orthogonality/Pythagoras' ...
JKL • 2,159

Only top scored, non community-wiki answers of a minimum length are eligible