
We know that $\operatorname{cov}(X,Y)=0$ does not guarantee that $X$ and $Y$ are independent. But if they are independent, their covariance must be $0$.
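
For illustration (a sketch of my own, not part of the original question), the classic counterexample is $X \sim N(0,1)$ with $Y = X^2$: the pair is obviously dependent, yet $\operatorname{cov}(X,Y) = \mathbb{E}(X^3) = 0$. A quick NumPy simulation (seed and sample size are arbitrary choices) confirms this numerically:

```python
import numpy as np

# A minimal sketch (illustration mine): X ~ N(0, 1) and Y = X^2.
# Y is a function of X, so the pair is dependent, yet
# cov(X, Y) = E[X^3] - E[X] E[X^2] = 0.
rng = np.random.default_rng(seed=0)
x = rng.standard_normal(1_000_000)
y = x**2

print(f"sample cov(X, Y) = {np.cov(x, y)[0, 1]:+.4f}")      # ~0: uncorrelated
print(f"E[Y]             = {y.mean():.4f}")                 # ~1 overall
print(f"E[Y | |X| > 2]   = {y[np.abs(x) > 2].mean():.4f}")  # ~5.7, far above E[Y]
```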

My question is: what conditions must the distributions of $X$ and $Y$ satisfy so that $\operatorname{cov}(X,Y)=0$ implies that $X$ and $Y$ are independent? (That is, under what conditions can one prove "if $\operatorname{cov}(X,Y)=0$ then $X$ and $Y$ are independent"?)

Peter Flom

1 Answer


Early analysis of this question can be found in Lancaster (1951) and Leipnik (1961). In the latter paper the author analyses the conditions required for uncorrelatedness to imply independence in a bivariate continuous distribution. If you have access to scholarly journals (e.g., through a university) then I recommend starting with these papers. I will also give a bit of insight for a special case.


It is worth noting that independence of $X$ and $Y$ is equivalent to the following moment condition:

$$\mathbb{E}(h(X) g(Y)) = \mathbb{E}(h(X)) \mathbb{E}(g(Y)) \quad \text{for all bounded measurable } g \text{ and } h. $$
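
As a sketch of how this condition operates in practice (the test functions, seed, and sample size below are my own choices, not from the answer): for the uncorrelated-but-dependent pair $X \sim N(0,1)$, $Y = X^2$, a single pair of bounded functions such as $h = g = \cos$ already violates the factorisation, revealing the dependence that covariance (the unbounded case $h(x)=x$, $g(y)=y$) misses:

```python
import numpy as np

# A sketch (choices mine): for X ~ N(0,1) and Y = X^2, test the
# factorisation E[h(X) g(Y)] = E[h(X)] E[g(Y)] with the bounded
# measurable functions h = g = cos.
rng = np.random.default_rng(seed=1)
x = rng.standard_normal(1_000_000)
y = x**2

lhs = np.mean(np.cos(x) * np.cos(y))
rhs = np.mean(np.cos(x)) * np.mean(np.cos(y))
print(f"E[h(X) g(Y)]    = {lhs:.4f}")  # ~0.57
print(f"E[h(X)] E[g(Y)] = {rhs:.4f}")  # ~0.35: clearly different, so dependent
```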

If both random variables have bounded support, then the Stone-Weierstrass theorem allows us to uniformly approximate the functions $g$ and $h$ by polynomials, which gives the equivalent condition:

$$\mathbb{E}(X^n Y^m) = \mathbb{E}(X^n) \mathbb{E}(Y^m) \quad \text{for all } n \in \mathbb{N} \text{ and } m \in \mathbb{N}.$$

(The case $n=m=1$ is zero correlation; the other cases are conditions for higher-order moment separability.) Thus, for random variables with bounded support, zero correlation together with moment separability at all higher orders is equivalent to independence.
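
As a concrete bounded-support sketch (example mine, not from the answer): take $X \sim \text{Uniform}(-1,1)$ and $Y = X^2$. Then $\operatorname{cov}(X,Y) = \mathbb{E}(X^3) = 0$, but moment separability already fails at $(n,m)=(2,1)$, since $\mathbb{E}(X^2 Y) = \mathbb{E}(X^4) = 1/5$ while $\mathbb{E}(X^2)\mathbb{E}(Y) = (1/3)(1/3) = 1/9$:

```python
import numpy as np

# A bounded-support sketch (example mine): X ~ Uniform(-1, 1), Y = X^2.
# cov(X, Y) = E[X^3] = 0, but separability fails at (n, m) = (2, 1):
# E[X^2 Y] = E[X^4] = 1/5, whereas E[X^2] E[Y] = (1/3) * (1/3) = 1/9.
rng = np.random.default_rng(seed=2)
x = rng.uniform(-1.0, 1.0, size=1_000_000)
y = x**2

print(f"sample cov(X, Y) = {np.cov(x, y)[0, 1]:+.4f}")         # ~0
print(f"E[X^2 Y]         = {np.mean(x**2 * y):.4f}")           # ~0.200
print(f"E[X^2] E[Y]      = {np.mean(x**2) * np.mean(y):.4f}")  # ~0.111
```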

Ben