From some simulations I ran, it appears that if $X$ is a normally distributed random variable with zero mean, and $Y$ is a random variable uncorrelated with $X$, then, if you sample $N$ values of both variables, compute

$S = \sum_{i=1}^N X_i Y_i$

and repeat this many times, the mean of $S$ converges to $0$.

Here is some R code:

N <- 1000    # values sampled per replicate
Nr <- 10000  # number of replicates

# Y = constant
m1 <- replicate(n = Nr, {
  X <- rnorm(n = N, mean = 0, sd = 1)
  Y <- rep(3.6, N)
  sum(X * Y)
})
hist(m1)
print(mean(m1))
plot(1:Nr, cumsum(m1) / (1:Nr), type = "l")

# Y = random normal (independent of X)
m2 <- replicate(n = Nr, {
  X <- rnorm(n = N, mean = 0, sd = 1)
  Y <- rnorm(n = N, mean = 2, sd = 2.5)
  sum(X * Y)
})
hist(m2)
print(mean(m2))
plot(1:Nr, cumsum(m2) / (1:Nr), type = "l")

# Y = uniform (independent of X)
m3 <- replicate(n = Nr, {
  X <- rnorm(n = N, mean = 0, sd = 1)
  Y <- runif(n = N, min = -3, max = -0.3)
  sum(X * Y)
})
hist(m3)
print(mean(m3))
plot(1:Nr, cumsum(m3) / (1:Nr), type = "l")

# Y = linear function of X (correlated with X; here the mean of S is not 0)
m4 <- replicate(n = Nr, {
  X <- rnorm(n = N, mean = 0, sd = 1)
  Y <- -1 + X / 3
  sum(X * Y)
})
hist(m4)
print(mean(m4))
plot(1:Nr, cumsum(m4) / (1:Nr), type = "l")

Is this expected?
Is there some theorem about it? (Or is it obvious?)
How would you write it? Is $E[\sum_{i=1}^N X_i Y_i] = 0$ a valid notation?
If you could help, or point me to some literature or a post describing this, that would be great.

Thanks!

  • Because $XY$ has the same distribution as $-XY,$ the expectation of $XY$ must be zero (the unique finite number equal to its own negative), whence the expectation of $S$ is zero, too. This is one of the arguments presented in an analogous question at https://stats.stackexchange.com/questions/257859. Also see the related thread at https://stats.stackexchange.com/questions/46843. – whuber Sep 21 '20 at 19:38
  • Thank you @whuber ; how would you test or prove that $XY$ has 'the same distribution' as $-XY$ ? I would be able to say that $X$ has the same distribution as $-X$ because the standard normal density function is symmetrical (I can replace $x$ by $-x$ and the function is the same). But wouldn't that imply that any $Y$ would then be OK? Whereas from my example m4 in the R code we see that for some $Y$ the expectation is not 0. – user6376297 Sep 22 '20 at 07:27
  • That's a good argument--right up to the end, where it fails. Provided $XY$ has a finite expectation, when $X$ and $-X$ have the same distribution, perforce $E[XY]=E[(-X)Y]=-E[XY]$ demonstrates $E[XY]=0,$ no matter what the distribution of $Y$ might be. Here's efficient R code to demonstrate: n <- 1e6; (function(x) c(mean(x), sd(x)/sqrt(length(x)))) (rnorm(n) * runif(n,-3,-0.3)) It outputs the mean of a very large sample along with its standard error. – whuber Sep 22 '20 at 13:00
  • "right up to the end, where it fails" LOL :) Yes @whuber, that was precisely my point. I was not trying to make a correct argument, I was trying to understand how one can take two random variables $X$ and $Y$, and conclude that '$XY$ has the same distribution as $-XY$'. You use that as the starting point of your argument, which I have no doubt is correct, and I get the second part of it, also reading the other posts you linked. But if I don't understand why the distributions are the same in the first place, I cannot extend this logic to other cases. – user6376297 Sep 22 '20 at 17:27
  • I was implicitly assuming $X$ and $Y$ are independent. When that's the case, the symmetry of either $X$ or $Y$ suffices to prove symmetry of $XY.$ Otherwise, we have to be careful, because negating $X$ will--because of the lack of independence--change $Y$ as well. – whuber Sep 22 '20 at 17:30
  • Yes, sorry, $X$ and $Y$ are independent. OK, now I will try to understand how the symmetry of either of them implies the symmetry of their product :S Really out of my comfort zone, aren't I? Thank you again for your help. – user6376297 Sep 22 '20 at 17:32
  • Right, I see the problem was described here: https://en.wikipedia.org/wiki/Product_distribution ; but I find your approach more straightforward. – user6376297 Sep 22 '20 at 18:22
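
A minimal R sketch of the symmetry check discussed above (assuming, as in the comments, that $X$ is standard normal and independent of $Y$, with $Y$ uniform as in the m3 case; the draw helper and sample size are my own illustration): two independent draws of $XY$ and $-XY$ should be indistinguishable in distribution.

set.seed(1)
n <- 1e5
draw <- function(n) rnorm(n) * runif(n, min = -3, max = -0.3)  # one sample of X*Y
P1 <- draw(n)    # sample of XY
P2 <- -draw(n)   # independent sample of -XY
qqplot(P1, P2, main = "XY vs -XY")  # points should fall on the 45-degree line
abline(0, 1)
ks.test(P1, P2)  # two-sample Kolmogorov-Smirnov test; expect a large p-value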

1 Answer


From the definition of correlation, saying "$Y$ is uncorrelated with $X$" is equivalent to saying $\mathrm{Cov}(X,Y) = 0$, i.e. $E[XY] = E[X]E[Y]$. Since $X$ has zero mean, $E[X] = 0$ and thus $E[XY] = 0$. By linearity of expectation, $E[S] = \sum_{i=1}^N E[X_i Y_i] = 0$.
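
A quick numerical sketch of the identity $E[XY] = \mathrm{Cov}(X,Y) + E[X]E[Y]$, reusing the m4 setup from the question (the sample size here is my own choice for illustration): there $\mathrm{Cov}(X,Y) = \mathrm{Var}(X)/3 = 1/3$ and $E[X]E[Y] = 0$, so $E[XY] = 1/3$ and $E[S] = N/3 \approx 333$ for $N = 1000$, matching the nonzero mean of m4.

set.seed(1)
n <- 1e6
X <- rnorm(n)
Y <- -1 + X / 3                  # the m4 case: Y is correlated with X
mean(X * Y)                      # ~ 1/3
cov(X, Y) + mean(X) * mean(Y)    # ~ 1/3, i.e. Cov(X,Y) + E[X]E[Y]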

  • Thanks! In fact this observation (that $E[XY] = E[X]*E[Y]$ when $X$ and $Y$ are uncorrelated) will be very useful for some calculations I am trying out. – user6376297 Sep 22 '20 at 16:05
  • Ouch https://en.wikipedia.org/wiki/Algebra_of_random_variables ¯\_(ツ)_/¯ But it is probably worth trying to understand what's behind the 'rules'. – user6376297 Sep 22 '20 at 18:13