89

We know the answer for two independent variables: $$ {\rm Var}(XY) = E(X^2Y^2) − (E(XY))^2={\rm Var}(X){\rm Var}(Y)+{\rm Var}(X)(E(Y))^2+{\rm Var}(Y)(E(X))^2$$

However, if we take the product of more than two variables, ${\rm Var}(X_1X_2 \cdots X_n)$, what would the answer be in terms of variances and expected values of each variable?
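A quick Monte Carlo sanity check of the two-variable identity above is sketched below; the normal and gamma distributions (and their parameters) are arbitrary choices assumed only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Independent X and Y with known moments (arbitrary illustrative choices).
X = rng.normal(loc=2.0, scale=1.5, size=n)   # E[X] = 2.0,  Var(X) = 2.25
Y = rng.gamma(shape=3.0, scale=0.5, size=n)  # E[Y] = 1.5,  Var(Y) = 0.75

# Empirical variance of the product XY.
empirical = np.var(X * Y)

# Var(X)Var(Y) + Var(X)(E[Y])^2 + Var(Y)(E[X])^2
vx, mx = 2.25, 2.0
vy, my = 0.75, 1.5
theoretical = vx * vy + vx * my**2 + vy * mx**2

print(empirical, theoretical)  # both should be close to 9.75
```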

24n8
  • 1,137
damla
  • 1,041
  • 9
    Because $X_1X_2\cdots X_{n-1}$ is a random variable and (assuming all the $X_i$ are independent) it is independent of $X_n$, the answer is obtained inductively: nothing new is needed. Lest this seem too mysterious, the technique is no different than pointing out that since you can add two numbers with a calculator, you can add $n$ numbers with the same calculator just by repeated addition. – whuber Mar 18 '13 at 23:46
  • 5
    Could you write out a proof of your displayed equation? I am curious to find out what happened to the $(E[XY])^2$ term which should give you some terms involving $\operatorname{cov}(X,Y)$. – Dilip Sarwate Mar 19 '13 at 00:46
  • 6
    @DilipSarwate, I suspect this question tacitly assumes $X$ and $Y$ are independent. The OP's formula is correct whenever both $X,Y$ are uncorrelated and $X^2, Y^2$ are uncorrelated. See my answer to a related question here. – Macro Mar 19 '13 at 01:53
  • 5
    @Macro I am well aware of the points that you raise. What I was trying to get the OP to understand and/or figure out for himself/herself was that for independent random variables, just as $E[X^2Y^2]$ simplifies to $$E[X^2Y^2]=E[X^2]E[Y^2]=(\sigma_X^2+\mu_X^2)(\sigma_Y^2+\mu_Y^2),$$ $E[(X_1\cdots X_n)^2]$ simplifies to $$E[(X_1\cdots X_n)^2]=E[X_1^2]\cdots E[X_n^2]=\prod_{i=1}^n(\sigma_{X_i}^2+\mu_{X_i}^2)$$ which I think is a more direct way of getting to the end result than the inductive method that whuber pointed out. – Dilip Sarwate Mar 19 '13 at 14:00
  • @DilipSarwate, nice. I suggest you post that as an answer so I can upvote it! – Macro Mar 19 '13 at 14:04
  • @Macro OK. I have added a few words of explanation too – Dilip Sarwate Mar 19 '13 at 14:29
  • @Dilip Your "more direct way" implicitly uses induction. I find it less direct insofar as it works with expectations of squares rather than variances, and therefore requires subsequent algebraic manipulations (which, once again, are implicit inductions). – whuber Mar 16 '19 at 16:41
  • https://link.springer.com/content/pdf/bbm:978-3-319-75465-9/1.pdf – Shahriar Mar 22 '23 at 21:06

1 Answer

78

I will assume that the random variables $X_1, X_2, \cdots , X_n$ are independent, a condition the OP has not included in the problem statement. With this assumption, we have that $$\begin{align} \operatorname{var}(X_1\cdots X_n) &= E[(X_1\cdots X_n)^2]-\left(E[X_1\cdots X_n]\right)^2\\ &= E[X_1^2\cdots X_n^2]-\left(E[X_1]\cdots E[X_n]\right)^2\\ &= E[X_1^2]\cdots E[X_n^2] - (E[X_1])^2\cdots (E[X_n])^2\\ &= \prod_{i=1}^n \left(\operatorname{var}(X_i)+(E[X_i])^2\right) - \prod_{i=1}^n \left(E[X_i]\right)^2 \end{align}$$

If the first product term above is multiplied out, one of the terms in the expansion cancels the second product term, so for the case $n=2$ we recover the result stated by the OP. As @Macro points out, for $n=2$ we need not assume that $X_1$ and $X_2$ are independent: the weaker condition that $X_1$ and $X_2$ are uncorrelated and that $X_1^2$ and $X_2^2$ are also uncorrelated suffices. But for $n \geq 3$, lack of correlation is not enough. Independence suffices, but is not necessary. What is required is the factoring of the expectation of the products shown above into products of expectations, which independence guarantees.
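For a numerical sanity check of this formula, here is a sketch with $n=3$; the three distributions and their parameters are arbitrary assumptions chosen only to exercise the result.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2_000_000

# Three independent variables with known means and variances
# (arbitrary illustrative choices).
samples = [
    rng.normal(1.0, 2.0, size=n),    # mean 1.0, variance 4.0
    rng.uniform(0.0, 3.0, size=n),   # mean 1.5, variance 0.75
    rng.exponential(2.0, size=n),    # mean 2.0, variance 4.0
]
means = np.array([1.0, 1.5, 2.0])
variances = np.array([4.0, 0.75, 4.0])

# Empirical variance of the product X1 * X2 * X3.
empirical = np.var(np.prod(samples, axis=0))

# prod_i (var(X_i) + (E[X_i])^2)  -  prod_i (E[X_i])^2
theoretical = np.prod(variances + means**2) - np.prod(means**2)

print(empirical, theoretical)  # theoretical value is 5 * 3 * 8 - 9 = 111
```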

Dilip Sarwate
  • 46,658
  • 1
    thanks a lot! I really appreciate it. Yes, the question was for independent random variables. – damla Mar 19 '13 at 19:32
  • Is it also possible to do the same thing for dependent variables? I am trying to figure out what would happen to the variance if $X_1=X_2=\cdots=X_n=X$. Can we derive a variance formula in terms of the variance and expected value of $X$? – damla Mar 26 '13 at 22:28
  • I have posted the question in a new page. Thanks a lot! http://stats.stackexchange.com/questions/53380/variance-of-powers-of-a-random-variable – damla Mar 26 '13 at 22:38
  • Dilip, is there a generalization to an arbitrary $n$ number of variables that are not independent? (This is a different question than the one asked by damla in their new question, which is about the variance of arbitrary powers of a single variable.) – Alexis Aug 07 '15 at 16:46
  • @Alexis To the best of my knowledge, there is no generalization to non-independent random variables, not even, as pointed out already, for the case of $3$ random variables. – Dilip Sarwate Aug 07 '15 at 18:33
  • @Alexis I withdraw my comment above. See this answer for the case of correlated random variables. – Dilip Sarwate Nov 19 '15 at 15:27
  • Why is $E[X_1^2\cdots X_n^2]=E[X_1^2]\cdots E[X_n^2]$? – Sibylse May 09 '23 at 13:34
  • @Sibylse If $X_1, \cdots, X_n$ are independent random variables, then $g(X_1), \cdots, g(X_n)$ are also independent random variables for any (measurable) function $g(\cdot)$. – Dilip Sarwate May 09 '23 at 18:00
  • This is an awesome solution for those trying to solve the generic problem: what is the variance of raising X to the power of N, where X is a random variable and N is a constant. Just wanted to leave it here, in case someone is looking for the answer to that question. I did several simulations to verify that the math matches experimental results. I have seen it asked before, but cannot find the question right now.... I will point to this question in case I find it in the future. – amilkar0417 Jun 02 '23 at 05:38
  • @amilkar0417 The solution I present does not apply to the question of computing the variance of $X^n$, because it requires that the $n$ random variables be independent, whereas $X,X,\cdots, X$ are not independent. See the comments by damla earlier in this thread. – Dilip Sarwate Jun 02 '23 at 23:42
  • Looks like there is an extra left paren in the second line of equations. Can't edit it because it's only one character :( – Rogach Sep 10 '23 at 13:34
  • @Rogach Thanks for your careful reading. I have removed the extra ( – Dilip Sarwate Sep 10 '23 at 13:52