
Let $\mathbf{X}=(X_1,X_2,\cdots, X_n)$ and $\mathbf{Y}=(Y_1,Y_2,\cdots, Y_m)$ be two random vectors. If each component of $\mathbf{X}$ is independent of each component of $\mathbf{Y}$, can we say that $\mathbf{X}$ and $\mathbf{Y}$ are independent? In other words, if $X_i$ is independent of $Y_j$ for every $1\le i \le n$ and $1\le j\le m$, are $\mathbf{X}$ and $\mathbf{Y}$ independent? If not, what about the special case when $\mathbf{X}$ and $\mathbf{Y}$ are multivariate normal?

  • Apparently, not. https://math.stackexchange.com/questions/1006256/are-two-random-vectors-independent-iff-every-pair-of-components-from-each-vecto – Greenparker Apr 23 '18 at 10:03
  • This is very closely related to the possibility that three variables might fail to be independent while any pair of them is independent: https://stats.stackexchange.com/questions/51322/does-independence-imply-conditional-independence/51325#51325. – whuber Apr 23 '18 at 14:25
  • @Greenparker Will it hold true if X and Y are multivariate normal? – Supreeth Narasimhaswamy Apr 23 '18 at 15:15
  • Yes--and that is explicitly pointed out in the reference given by @Greenparker. – whuber Apr 24 '18 at 14:27

1 Answer


In the general case, no: there is a distinction in probability theory between pairwise independence and mutual (joint) independence, and pairwise independence of the components does not imply that the vectors themselves are independent. In the special case where the two random vectors have a joint multivariate normal distribution, yes: the dependence between the vectors is captured entirely by the second moments (the cross-covariances), so pairwise independence of all component pairs (or even merely zero correlation between all pairs) is sufficient to ensure that $\mathbf{X}$ and $\mathbf{Y}$ are independent.
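
To make the general-case "no" concrete, here is a minimal counterexample in the spirit of the questions linked in the comments (the variables $Z_1, Z_2$ are introduced here purely for illustration and are not taken from the thread), followed by a sketch of the covariance argument for the jointly normal case.

Let $Z_1, Z_2$ be independent with $P(Z_k = 1) = P(Z_k = -1) = \tfrac12$, and set
$$
\mathbf{X} = (Z_1, Z_2), \qquad \mathbf{Y} = (Z_1 Z_2).
$$
Each $X_i$ is independent of $Y_1$: for instance,
$$
P(X_1 = 1,\, Y_1 = 1) = P(Z_1 = 1,\, Z_2 = 1) = \tfrac14 = P(X_1 = 1)\,P(Y_1 = 1),
$$
and the same check works for every sign combination. Yet $Y_1 = X_1 X_2$ is a deterministic, non-constant function of $\mathbf{X}$, so $\mathbf{X}$ and $\mathbf{Y}$ are not independent.

For the jointly normal case, write the covariance matrix of the stacked vector in block form,
$$
\operatorname{Cov}\begin{pmatrix}\mathbf{X}\\ \mathbf{Y}\end{pmatrix}
= \begin{pmatrix}\Sigma_{XX} & \Sigma_{XY}\\ \Sigma_{XY}^{\top} & \Sigma_{YY}\end{pmatrix},
\qquad (\Sigma_{XY})_{ij} = \operatorname{Cov}(X_i, Y_j).
$$
Pairwise independence (indeed, merely zero correlation) of every pair $(X_i, Y_j)$ gives $\Sigma_{XY} = 0$; the joint normal density then factors into the product of the marginal densities of $\mathbf{X}$ and $\mathbf{Y}$, so the two vectors are independent.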

Ben