
We know the answer for two independent variables: $$ {\rm Var}(XY) = E(X^2Y^2) - (E(XY))^2={\rm Var}(X){\rm Var}(Y)+{\rm Var}(X)(E(Y))^2+{\rm Var}(Y)(E(X))^2$$

However, if we take the product of more than two variables, ${\rm Var}(X_1X_2 \cdots X_n)$, what would the answer be in terms of variances and expected values of each variable?
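The two-variable identity can be checked exactly against small discrete distributions, with no simulation noise. The sketch below (the distributions are arbitrary illustrative choices, not from the question) compares the right-hand side of the formula with $\operatorname{var}(XY)$ computed directly from the joint distribution:

```python
from itertools import product

# Exact check of Var(XY) = Var(X)Var(Y) + Var(X)E(Y)^2 + Var(Y)E(X)^2
# for independent X and Y, using small discrete distributions
# (the distributions are arbitrary illustrative choices).

def moments(dist):
    """Mean and variance of a discrete distribution {value: probability}."""
    mean = sum(v * p for v, p in dist.items())
    var = sum((v - mean) ** 2 * p for v, p in dist.items())
    return mean, var

X = {0: 0.2, 1: 0.5, 3: 0.3}
Y = {1: 0.6, 2: 0.4}

# Distribution of XY under independence: P(XY = xy) accumulates P(X=x)P(Y=y).
XY = {}
for (x, px), (y, py) in product(X.items(), Y.items()):
    XY[x * y] = XY.get(x * y, 0.0) + px * py

mx, vx = moments(X)
my, vy = moments(Y)
_, v_exact = moments(XY)
v_formula = vx * vy + vx * my**2 + vy * mx**2

print(abs(v_exact - v_formula) < 1e-12)  # True
```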

  • Because $X_1X_2\cdots X_{n-1}$ is a random variable and (assuming all the $X_i$ are independent) it is independent of $X_n$, the answer is obtained inductively: nothing new is needed. Lest this seem too mysterious, the technique is no different than pointing out that since you can add two numbers with a calculator, you can add $n$ numbers with the same calculator just by repeated addition.
    – whuber
    Commented Mar 18, 2013 at 23:46
  • Could you write out a proof of your displayed equation? I am curious to find out what happened to the $(E[XY])^2$ term which should give you some terms involving $\operatorname{cov}(X,Y)$.
    Commented Mar 19, 2013 at 0:46
  • @DilipSarwate, I suspect this question tacitly assumes $X$ and $Y$ are independent. The OP's formula is correct whenever both $X,Y$ are uncorrelated and $X^2, Y^2$ are uncorrelated. See my answer to a related question here.
    – Macro
    Commented Mar 19, 2013 at 1:53
  • @Macro I am well aware of the points that you raise. What I was trying to get the OP to understand and/or figure out for himself/herself was that for independent random variables, just as $E[X^2Y^2]$ simplifies to $$E[X^2Y^2]=E[X^2]E[Y^2]=(\sigma_X^2+\mu_X^2)(\sigma_Y^2+\mu_Y^2),$$ $E[(X_1\cdots X_n)^2]$ simplifies to $$E[(X_1\cdots X_n)^2]=E[X_1^2]\cdots E[X_n^2]=\prod_{i=1}^n(\sigma_{X_i}^2+\mu_{X_i}^2)$$ which I think is a more direct way of getting to the end result than the inductive method that whuber pointed out.
    Commented Mar 19, 2013 at 14:00
  • @DilipSarwate, nice. I suggest you post that as an answer so I can upvote it!
    – Macro
    Commented Mar 19, 2013 at 14:04
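The second-moment factorization in the comment above can also be verified exactly for small discrete distributions; the sketch below (with arbitrary illustrative distributions, not taken from the thread) checks $E[X^2Y^2]=(\sigma_X^2+\mu_X^2)(\sigma_Y^2+\mu_Y^2)$ for independent $X$ and $Y$:

```python
# Check E[X^2 Y^2] = E[X^2] E[Y^2] = (var_X + mean_X^2)(var_Y + mean_Y^2)
# for independent discrete X, Y (arbitrary illustrative distributions).

def moments(dist):
    """Mean and variance of a discrete distribution {value: probability}."""
    mean = sum(v * p for v, p in dist.items())
    var = sum((v - mean) ** 2 * p for v, p in dist.items())
    return mean, var

X = {1: 0.25, 2: 0.75}
Y = {-1: 0.5, 3: 0.5}

# E[X^2 Y^2] computed directly from the joint distribution (independence).
exy2 = sum(px * py * (x * y) ** 2 for x, px in X.items() for y, py in Y.items())

mx, vx = moments(X)
my, vy = moments(Y)
factored = (vx + mx**2) * (vy + my**2)

print(abs(exy2 - factored) < 1e-12)  # True
```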

1 Answer


I will assume that the random variables $X_1, X_2, \cdots, X_n$ are independent, a condition the OP did not include in the problem statement. With this assumption, we have that $$\begin{align} \operatorname{var}(X_1\cdots X_n) &= E[(X_1\cdots X_n)^2]-\left(E[X_1\cdots X_n]\right)^2\\ &= E[X_1^2\cdots X_n^2]-\left(E[X_1]\cdots E[X_n]\right)^2\\ &= E[X_1^2]\cdots E[X_n^2] - (E[X_1])^2\cdots (E[X_n])^2\\ &= \prod_{i=1}^n \left(\operatorname{var}(X_i)+(E[X_i])^2\right) - \prod_{i=1}^n \left(E[X_i]\right)^2 \end{align}$$

If the first product term above is multiplied out, one of the terms in the expansion cancels the second product term. Thus, for the case $n=2$, we recover the result stated by the OP. As @Macro points out, for $n=2$ we need not assume that $X_1$ and $X_2$ are independent: the weaker condition that $X_1$ and $X_2$ are uncorrelated, and that $X_1^2$ and $X_2^2$ are uncorrelated as well, suffices. But for $n \geq 3$, lack of correlation is not enough. Independence suffices but is not necessary: what is required is that the expectations of the products shown above factor into products of expectations, which independence guarantees.
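As a sanity check of the general formula, the sketch below compares it against an exact enumeration of the product's distribution for three independent discrete random variables (the distributions are arbitrary illustrative choices, not from the answer):

```python
from itertools import product

# Exact check of
#   Var(X_1 ... X_n) = prod_i (var_i + mean_i^2) - prod_i mean_i^2
# for independent discrete X_i (n = 3 here; arbitrary illustrative
# distributions).

def moments(dist):
    """Mean and variance of a discrete distribution {value: probability}."""
    mean = sum(v * p for v, p in dist.items())
    var = sum((v - mean) ** 2 * p for v, p in dist.items())
    return mean, var

dists = [
    {1: 0.5, 2: 0.5},
    {0: 0.3, 4: 0.7},
    {-1: 0.4, 1: 0.2, 2: 0.4},
]

# Exact distribution of the product X_1 X_2 X_3 under independence.
prod_dist = {}
for combo in product(*(d.items() for d in dists)):
    value, prob = 1, 1.0
    for v, p in combo:
        value *= v
        prob *= p
    prod_dist[value] = prod_dist.get(value, 0.0) + prob

_, v_exact = moments(prod_dist)

first, second = 1.0, 1.0
for m, v in (moments(d) for d in dists):
    first *= v + m**2   # E[X_i^2] = var(X_i) + (E[X_i])^2
    second *= m**2      # (E[X_i])^2

v_formula = first - second
print(abs(v_exact - v_formula) < 1e-9)  # True
```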

  • thanks a lot! I really appreciate it. Yes, the question was for independent random variables.
    – damla
    Commented Mar 19, 2013 at 19:32
  • Is it also possible to do the same thing for dependent variables? I am trying to figure out what would happen to the variance if $X_1=X_2=\cdots=X_n=X$. Can we derive a variance formula in terms of the variance and expected value of $X$?
    – damla
    Commented Mar 26, 2013 at 22:28
  • I have posted the question in a new page. Thanks a lot! stats.stackexchange.com/questions/53380/…
    – damla
    Commented Mar 26, 2013 at 22:38
  • Dilip, is there a generalization to an arbitrary number $n$ of variables that are not independent? (This is a different question than the one asked by damla in their new question, which is about the variance of arbitrary powers of a single variable.)
    – Alexis
    Commented Aug 7, 2015 at 16:46
  • @Alexis To the best of my knowledge, there is no generalization to non-independent random variables, not even, as pointed out already, for the case of $3$ random variables.
    Commented Aug 7, 2015 at 18:33
