I read from my textbook that $\text{cov}(X,Y)=0$ does not guarantee X and Y are independent. But if they are independent, their covariance must be 0. I could not think of any proper example yet; could someone provide one?
5 Answers
Easy example: Let $X$ be a random variable that is $-1$ or $+1$ with probability 0.5. Then let $Y$ be a random variable such that $Y=0$ if $X=-1$, and $Y$ is randomly $-1$ or $+1$ with probability 0.5 if $X=1$.
Clearly $X$ and $Y$ are highly dependent (since knowing $Y$ allows me to perfectly know $X$), but their covariance is zero: They both have zero mean, and
$$\begin{aligned} \mathbb{E}[XY] &= (-1)\cdot 0 \cdot P(X=-1) \\
&\quad + 1 \cdot 1 \cdot P(X=1, Y=1) \\
&\quad + 1 \cdot (-1) \cdot P(X=1, Y=-1) \\
&= 0. \end{aligned}$$
Or more generally, take any distribution $P(X)$ and any $P(Y|X)$ such that $P(Y=a|X) = P(Y=-a|X)$ for all $X$ (i.e., a joint distribution that is symmetric around the $x$ axis), and you will always have zero covariance. But you will have non-independence whenever $P(Y|X) \neq P(Y)$; i.e., the conditionals are not all equal to the marginal. Or ditto for symmetry around the $y$ axis.
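A quick simulation of this construction (a plain-Python sketch; the sample size and seed are arbitrary) shows the sample covariance sitting near zero while the conditional distribution of $Y$ given $X=1$ clearly differs from its marginal:

```python
import random

random.seed(0)
n = 100_000
xs, ys = [], []
for _ in range(n):
    x = random.choice([-1, 1])
    # Y = 0 when X = -1; otherwise Y is -1 or +1 with equal probability
    y = 0 if x == -1 else random.choice([-1, 1])
    xs.append(x)
    ys.append(y)

mean_x = sum(xs) / n
mean_y = sum(ys) / n
cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(xs, ys)) / n

p_y0 = ys.count(0) / n  # marginal P(Y = 0), about 0.5
p_y0_given_x1 = sum(1 for a, b in zip(xs, ys) if a == 1 and b == 0) / xs.count(1)

print(f"cov ~ {cov:.4f}, P(Y=0) ~ {p_y0:.2f}, P(Y=0 | X=1) = {p_y0_given_x1}")
```

The covariance estimate hovers around 0, yet $P(Y=0 \mid X=1) = 0$ while $P(Y=0) \approx 0.5$, so the variables are dependent.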
Here is the example I always give to students. Take a random variable $X$ with $E[X]=0$ and $E[X^3]=0$, e.g. a normal random variable with zero mean. Take $Y=X^2$. It is clear that $X$ and $Y$ are related, but
$$Cov(X,Y)=E[XY]-E[X]\cdot E[Y]=E[X^3]=0.$$
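A short numerical check of this example (a sketch in plain Python; the seed and sample size are arbitrary choices):

```python
import random

random.seed(1)
n = 200_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]  # X ~ N(0, 1)
ys = [x * x for x in xs]                         # Y = X^2, fully determined by X

mean_x = sum(xs) / n
mean_y = sum(ys) / n
cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / n
print(f"sample Cov(X, X^2) ~ {cov:.4f}")  # close to 0
```

Despite $Y$ being a deterministic function of $X$, the sample covariance is a small number converging to 0 as $n$ grows.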
I like that example too. As a particular case, a N(0,1) rv and a chi2(1) rv are uncorrelated. – ocram Jul 15 '11 at 08:21
+1 but as a minor nitpick, you do need to assume that $E[X^3] = 0$ separately (it does not follow from the assumption of symmetry of the distribution or from $E[X] = 0$), so that we don't have issues such as $E[X^3]$ working out to be of the form $\infty - \infty$. And I am queasy about @ocram's assertion that "a N(0,1) rv and a chi2(1) rv are uncorrelated." (emphasis added) Yes, $X \sim N(0,1)$ and $X^2 \sim \chi^2(1)$ are uncorrelated, but not any $N(0,1)$ and $\chi^2(1)$ random variables. – Dilip Sarwate Feb 16 '12 at 03:09
@DilipSarwate, thanks, I've edited my answer accordingly. When I wrote it I thought about normal variables, for which the zero third moment follows from the zero mean. – mpiktas Feb 16 '12 at 08:25
@Dilip Note, though, that the zero third moment is an immediate consequence of symmetry and a zero mean, making this example more general than it might appear. A distribution with these properties can be generated from any starting distribution with finite absolute third moment simply by symmetrizing it, giving arbitrarily rich examples. – whuber Mar 22 '23 at 19:34
@Dilip That's why I specified that the absolute third moment must be finite. – whuber Mar 23 '23 at 02:21
Some other examples: consider data points that form a circle or ellipse. The covariance is 0, but knowing $x$ narrows $y$ down to just 2 values. The same holds for data in a square or rectangle. Data that forms an X, a V, a ^, a <, or a > will also give covariance 0 without being independent. If $y = \sin(x)$ (or $\cos$) and $x$ covers an integer multiple of periods, then the covariance will equal 0, but knowing $x$ you know $y$ exactly, or at least $|y|$ in the ellipse, X, <, and > cases.
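For the circle case this can be checked almost exactly: for points equally spaced on the unit circle, the sample covariance of the coordinates vanishes up to floating-point error (a small deterministic sketch; the number of points is arbitrary):

```python
import math

n = 1_000
# n points equally spaced on the unit circle
pts = [(math.cos(2 * math.pi * k / n), math.sin(2 * math.pi * k / n))
       for k in range(n)]

mean_x = sum(x for x, _ in pts) / n
mean_y = sum(y for _, y in pts) / n
cov = sum((x - mean_x) * (y - mean_y) for x, y in pts) / n
print(cov)  # essentially 0, yet y is determined by x up to sign
```

The positive and negative contributions cancel by symmetry, exactly as described above, even though knowing $x$ pins $y$ down to two values.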
That "if" should be "if $x$ covers an integer multiple of periods beginning at a peak or trough", or more generally: "if $x$ covers an interval on which $y$ is symmetric" – naught101 Feb 16 '12 at 00:47
@user1993, Look at the formula for covariance (or correlation). Then think about the circle/ellipse. Subtracting the means gives a circle centered on (0,0), so for every point on the circle you can reflect the point around the x-axis, the y-axis, and both axes to find a total of 4 points that will all contribute the exact same absolute value to the covariance, but 2 will be positive and 2 will be negative, giving a sum of 0. Do this for all of the points on the circle and you will be adding together a bunch of 0's, giving a total covariance of 0. – Greg Snow Nov 06 '19 at 20:51
Inspired by mpiktas's answer.
Consider $X$ to be a uniformly distributed random variable, i.e. $X \sim U(-1,1)$, with density $f(x) = 1/2$ on $(-1,1)$. Here, $$E[X] = (a+b)/2 = 0,$$ $$E[X^2] = \frac{1}{2}\int_{-1}^{1} x^2 \, dx = 1/3,$$ $$E[X^3] = \frac{1}{2}\int_{-1}^{1} x^3 \, dx = 0.$$
Since $Cov(X, Y) = E[XY] - E[X] \cdot E[Y]$, $$ Cov(X^2, X) = E[X^3] - E[X] \cdot E[X^2] = 0 - 0 \cdot 1/3 = 0. $$ Clearly $X$ and $X^2$ are not independent, but their covariance is computed to be zero. Since a counterexample has been found, the proposition is false in general.
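The moments above can be sanity-checked by simulation (a plain-Python sketch; seed and sample size are arbitrary). Note that the expectations include the uniform density $1/2$, so $E[X^2] = 1/3$:

```python
import random

random.seed(2)
n = 200_000
xs = [random.uniform(-1.0, 1.0) for _ in range(n)]  # X ~ U(-1, 1)

m1 = sum(xs) / n                   # estimates E[X]   = 0
m2 = sum(x * x for x in xs) / n    # estimates E[X^2] = 1/3
m3 = sum(x ** 3 for x in xs) / n   # estimates E[X^3] = 0
cov = m3 - m1 * m2                 # Cov(X^2, X) = E[X^3] - E[X] E[X^2]
print(f"E[X^2] ~ {m2:.3f}, Cov(X^2, X) ~ {cov:.4f}")
```

The estimated second moment lands near $1/3$ and the covariance near $0$, matching the derivation.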

It's like asking 'Am I driving recklessly?' One question might be 'Are you travelling 25 mph over the speed limit?' But that isn't the only way to drive recklessly. Another question could be 'Are you drunk?', and so on. There is more than one way to drive recklessly.
– Adam Feb 16 '12 at 04:13