Let $X$ be a random vector in $\mathbb{R}^p$, where $p\geq 2$, with the following property: any two uncorrelated linear marginals are independent. Formally:
(1) For any $\alpha,\beta\in \mathbb{R}^p$, if $\operatorname{Cov}(\alpha^TX,\beta^TX)=0$, then $\alpha^TX$ and $\beta^TX$ are independent.
Is $X$ necessarily jointly Gaussian?
A similar question has been asked here. That question, however, concerns a fixed pair of random variables, while mine concerns all linear marginals of a single random vector.
The Bernoulli example given there does not satisfy (1). Let $R:=(R_1,R_2)$ be a vector whose components are two independent symmetric Bernoulli random variables (uniformly distributed on $\{-1,+1\}$). Since $E[R]=0$, the covariance of two linear marginals is
$$ \operatorname{Cov}(\alpha^TR,\beta^TR)=E[\alpha^TR\,\beta^TR]=\alpha_1\beta_1+\alpha_2\beta_2=\alpha^T\beta. $$
Now take $\alpha=(1,2)$ and $\beta=(-2,1)$, so that $\alpha^T\beta=0$. Yet the event $\alpha^TR=3$ occurs if and only if $R=(1,1)$, which in turn occurs if and only if $\beta^TR=-1$; hence $P(\alpha^TR=3,\,\beta^TR=-1)=1/4\neq 1/16=P(\alpha^TR=3)\,P(\beta^TR=-1)$. So the random vector $R$ does not satisfy (1).
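For concreteness, here is a minimal verification sketch (my own addition, not part of the cited question) that enumerates the four equally likely outcomes of $R$ and confirms that $\alpha^TR$ and $\beta^TR$ are uncorrelated but not independent:

```python
from itertools import product
from fractions import Fraction

alpha = (1, 2)
beta = (-2, 1)

# The four sign patterns of R = (R_1, R_2), each with probability 1/4.
outcomes = [(r, Fraction(1, 4)) for r in product([-1, 1], repeat=2)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Both linear marginals have mean 0, so Cov = E[(alpha^T R)(beta^T R)].
cov = sum(p * dot(alpha, r) * dot(beta, r) for r, p in outcomes)
print("Cov(alpha^T R, beta^T R) =", cov)  # prints 0

# Independence fails at the point (3, -1):
p_joint = sum(p for r, p in outcomes if dot(alpha, r) == 3 and dot(beta, r) == -1)
p_a = sum(p for r, p in outcomes if dot(alpha, r) == 3)
p_b = sum(p for r, p in outcomes if dot(beta, r) == -1)
print(p_joint, "vs", p_a * p_b)  # 1/4 vs 1/16
```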