
Let $X$ be a random vector in $\mathbb{R}^p$, where $p\geq 2$, with the following property: Any two uncorrelated marginals are independent. Formally:

(1) For any $\alpha,\beta\in \mathbb{R}^p$, if $Cov(\alpha^TX,\beta^TX)=0$ then $\alpha^TX$ and $\beta^TX$ are independent.

Is $X$ necessarily jointly Gaussian?


A similar question has been asked here, but that question is about a single pair of random variables, while mine is about all marginals of a single random vector.

The Bernoulli example given there does not satisfy (1). Let $R:=(R_1,R_2)$ be a vector whose components are two independent symmetric Bernoulli random variables (uniformly distributed on $\{-1,+1\}$). Since $E[R]=0$, the covariance of two linear marginals is $$ Cov(\alpha^TR,\beta^TR)=E[\alpha^TR\,\beta^TR]=\alpha_1\beta_1+\alpha_2\beta_2=\alpha^T\beta. $$ Now take $\alpha=(1,2)$ and $\beta=(-2,1)$, so that $\alpha^T\beta=0$. Yet the event $\alpha^TR=3$ occurs if and only if $R=(1,1)$, which occurs if and only if $\beta^TR=-1$. So the random vector $R$ does not satisfy (1).
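As a quick numerical sanity check (a minimal sketch of my own, using numpy; not part of the argument above), the following confirms that $\alpha^TR$ and $\beta^TR$ are uncorrelated yet dependent:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
R = rng.choice([-1, 1], size=(n, 2))     # two independent symmetric Bernoulli components
alpha = np.array([1, 2])
beta = np.array([-2, 1])                 # alpha^T beta = 0
U, V = R @ alpha, R @ beta

print(np.corrcoef(U, V)[0, 1])           # ~ 0: the two linear marginals are uncorrelated
print(np.all(V[U == 3] == -1))           # True: U = 3 forces V = -1, so they are dependent
```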

  • A step in the right direction is the Herschel-Maxwell Theorem. It states that if the distribution of $X$ is rotationally invariant in addition to (1), then the components of $X$ are normal. – Idontgetit Sep 04 '23 at 08:39
  • Are the marginal distributions of $X$ assumed to be Gaussian? This is the key for answering here. – Spätzle Sep 05 '23 at 11:21
  • @Spätzle, the only assumption is (1). So, the marginals are not assumed to be Gaussian. – Idontgetit Sep 05 '23 at 15:05
  • The thing here with Herschel-Maxwell is that it requires all components to be independent. Here, you only know that "Any two uncorrelated marginals are independent", not that all of the components are independent. – Spätzle Sep 06 '23 at 10:05
  • Also, there's something strange with the definition - the marginals of $X$ should be $x_1,x_2,$ etc. If you're going for two marginals being uncorrelated, that would be a linear combination with scalars, e.g. $ax_1+bx_2$. Using the $Cov$ operator on $\alpha^TX,\beta^TX$ is in fact the autocorrelation function of the vector and has little to do with the marginals themselves, and we can do almost nothing with the joint distribution of $\alpha^TX,\beta^TX$. – Spätzle Sep 06 '23 at 13:15
  • @Spätzle Thank you for your replies. We may take $\alpha=e_1=(1,0,\ldots,0)$, in which case $\alpha^TX=x_1$. Consequently, (1) implies that the components are independent. If you don't like my use of the word 'marginal' here, please ignore it. I only want to know whether (1) implies that $X$ is jointly Gaussian. – Idontgetit Sep 06 '23 at 13:31
  • I think I wasn't clear enough: the term $\alpha^TX$ is de facto a weighted sum of the components. What (1) says is "Given two such sums ($\alpha^TX,\beta^TX$), if they are found to be uncorrelated then they are also independent". Then again, for any realization of $X$ both $\alpha$ and $\beta$ would be multiplied by the same values - meaning a lot is up to the choice of the weights. – Spätzle Sep 06 '23 at 14:12
  • If you could provide some wider context (or, if it's homework - from which book/course?) that would be great, because it feels like some background details are missing here. – Spätzle Sep 06 '23 at 14:13
  • I am using property (1) in a paper I'm working on. So far I assumed that $X$ is Gaussian, but I am curious whether other distributions also satisfy (1). – Idontgetit Sep 06 '23 at 14:28

2 Answers


The answer is yes. This is a consequence of the Darmois-Skitovich theorem. It states that if $X_1,\ldots,X_p$ are independent random variables and $\alpha_i\neq 0$ and $\beta_i\neq 0$ for all $i\in\{1,\ldots,p\}$, then $\alpha^TX$ and $\beta^TX$ can be independent only if all the $X_i$ are Gaussian.

Suppose (1) is true. Note that without loss of generality, we may assume $\mathbb{E}[X]=0$ and $\mathbb{E}[XX^T]=I$, so that $Cov(\alpha^TX,\beta^TX)=\alpha^T\beta$. First, choosing $\alpha,\beta$ to be distinct coordinate unit vectors (which are orthogonal), (1) shows that $X_1,\ldots,X_p$ are independent. Next, choose any orthogonal $\alpha,\beta$ with all coefficients nonzero; by (1), $\alpha^TX$ and $\beta^TX$ are independent. So, by the Darmois-Skitovich theorem, $X_1,\ldots,X_p$ are Gaussian. As $X_1,\ldots,X_p$ are independent Gaussians, $X$ is jointly Gaussian.
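To see the Darmois-Skitovich mechanism concretely, here is a minimal numerical sketch (my own illustration in numpy, with an exponential distribution picked arbitrarily; not part of the proof above): with independent non-Gaussian components, two orthogonal linear forms with all coefficients nonzero are uncorrelated but visibly dependent, so such a vector cannot satisfy (1).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500_000
X = rng.exponential(size=(n, 2)) - 1.0       # independent, centred, unit-variance, non-Gaussian components
alpha = np.array([1.0, 1.0]) / np.sqrt(2)
beta = np.array([1.0, -1.0]) / np.sqrt(2)    # alpha^T beta = 0, all coefficients nonzero
U, V = X @ alpha, X @ beta

print(np.corrcoef(U, V)[0, 1])               # ~ 0: the linear forms are uncorrelated
print(np.corrcoef(U**2, V**2)[0, 1])         # clearly positive: U and V are not independent
```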


I can't quite prove it, but I think that yes, $X$ has to be Gaussian. Also, as a point of nomenclature: I'd call $\alpha^TX$ a linear combination.

Now let's say $E[X] = \vec 0$ and $Cov(X) = \Sigma$, with $\Sigma$ symmetric and positive definite, i.e. non-degenerate. Then we can decompose $$\Sigma = V \begin{pmatrix} \lambda_{1} & &0 \\ & \ddots & \\ 0& & \lambda_{p} \end{pmatrix} V^T $$ with $V$ an orthogonal matrix of eigenvectors and $\lambda_1,\ldots,\lambda_p$ the eigenvalues. This then allows us to define the likewise symmetric and positive definite $$\Sigma^{-1/2} = V \begin{pmatrix} \frac{1}{\sqrt\lambda_{1}} & & 0\\ & \ddots & \\ 0& & \frac{1}{\sqrt\lambda_{p}} \end{pmatrix} V^T $$

Now consider $Z := \Sigma^{-1/2}X$. Since $\Sigma^{-1/2}$ is symmetric, $$ Cov(Z) = Cov(\Sigma^{-1/2}X) = \Sigma^{-1/2}\, \Sigma\, \Sigma^{-1/2} = I_p $$
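A minimal numerical sketch of this whitening step (numpy; the example $\Sigma$ and the variable names are mine): build $\Sigma^{-1/2}$ from the eigendecomposition and check that $\Sigma^{-1/2}\Sigma\Sigma^{-1/2}=I_p$.

```python
import numpy as np

Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])                    # a symmetric positive definite example
lam, V = np.linalg.eigh(Sigma)                    # Sigma = V diag(lam) V^T
Sigma_inv_sqrt = V @ np.diag(1.0 / np.sqrt(lam)) @ V.T

# Cov(Z) = Sigma^{-1/2} Sigma Sigma^{-1/2}; this prints (approximately) the identity matrix.
print(Sigma_inv_sqrt @ Sigma @ Sigma_inv_sqrt)
```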

Since each component $Z_i$ of $Z$ is just a linear combination of the components of $X$, and the $Z_i$ are uncorrelated, they are all independent by condition (1). Also, for any orthogonal matrix $H$, i.e. any kind of rotation, $Cov(HZ) = H\,Cov(Z)\,H^T = HH^T = I_p$, so we have "rotational independence".

I can't rigorously prove that rotational symmetry follows from rotational independence, but non-rigorously, any non-rotationally-symmetric "feature" of the joint distribution would have to be parallel to an axis, and this is a property that does not survive rotation.

If we accept that argument, then by Herschel-Maxwell $Z\sim \mathcal N (\vec 0, I_p)$ and therefore $X = \Sigma^{1/2}Z \sim \mathcal N (\vec 0, \Sigma)$.

Lukas Lohse
  • Thank you for your answer, Lukas! Indeed, we can assume mean zero and identity covariance without loss of generality. However, an identity covariance matrix and independent components do not imply rotational invariance; the symmetric Bernoulli distribution is a counterexample. So, unfortunately, the argument does not work. – Idontgetit Sep 11 '23 at 08:09
  • It's not just $Z$ that has independent components; each rotation of $Z$, $Y = HZ = H\Sigma^{-1/2}X$, does as well. Take two iid Bernoulli on $\{-1, 1\}$ and rotate them by 45 degrees: now $P(Y_2 = 0) = 0.5$, but $P(Y_2 = 0|Y_1 = \sqrt{2}) = 1$. – Lukas Lohse Sep 11 '23 at 08:21
  • I guess if $p = 1$, then yeah. There are counterexamples, but that is not that interesting. – Lukas Lohse Sep 11 '23 at 08:25