Question: Suppose we have two random variables $X$ and $Y$ that follow non-Gaussian distributions, and we are given that:
$$\operatorname{cum}(X, Y)=\operatorname{cum}(X, X, Y)=\operatorname{cum}(X, X, X, Y)=\cdots=0$$
Does this imply that $X$ and $Y$ are independent? If not, are there any conditions under which this implication holds?
Notation: here we define the joint cumulant of $k$ variables $Z_{i_1},\ldots,Z_{i_k}$ as:
$$\operatorname{cum}\left(Z_{i_1},\ldots,Z_{i_k}\right)=\sum_{\{A_1,\ldots,A_L\}}(-1)^{L-1}(L-1)!\,\mathbb{E}\left[\prod_{j\in A_1}Z_j\right]\mathbb{E}\left[\prod_{j\in A_2}Z_j\right]\cdots\mathbb{E}\left[\prod_{j\in A_L}Z_j\right],$$ where the sum runs over all (unordered) partitions $\{A_1,\ldots,A_L\}$ of the index set $\{i_1,\ldots,i_k\}$.
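For concreteness, here is a minimal Python sketch of this definition, estimating each expectation by a sample mean; the helper names `set_partitions` and `sample_cum` are my own for illustration, not from any library:

```python
# Minimal sketch: estimate cum(Z_{i_1}, ..., Z_{i_k}) from samples by
# summing over all set partitions, per the definition above.
import math
import numpy as np

def set_partitions(items):
    """Yield every partition of `items` as a list of blocks (lists)."""
    if len(items) == 1:
        yield [items]
        return
    first, rest = items[0], items[1:]
    for smaller in set_partitions(rest):
        # insert `first` into each existing block in turn ...
        for i in range(len(smaller)):
            yield smaller[:i] + [[first] + smaller[i]] + smaller[i + 1:]
        # ... or give it a block of its own
        yield [[first]] + smaller

def sample_cum(*samples):
    """Monte Carlo estimate of cum(Z_1, ..., Z_k) from equal-length arrays."""
    k = len(samples)
    total = 0.0
    for partition in set_partitions(list(range(k))):
        L = len(partition)
        term = (-1.0) ** (L - 1) * math.factorial(L - 1)
        for block in partition:
            # E[prod_{j in A_l} Z_j], estimated by an elementwise sample mean
            term *= np.prod([samples[j] for j in block], axis=0).mean()
        total += term
    return total
```

As a sanity check, `sample_cum(x)` returns the sample mean, and `sample_cum(x, x)` recovers $\mathbb{E}[X^2]-\mathbb{E}[X]^2$, the (biased) sample variance.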
Materials that might help:
- The reverse direction is true regardless of Gaussianity: any joint cumulant involving two (or more) independent random variables is zero (see the numeric sketch after this list). link
- In general, factorization of moments (i.e., $\mathbb{E}(X^n Y^m) = \mathbb{E}(X^n)\mathbb{E}(Y^m)$ for all $n, m$) does not imply independence; see the linked discussion for conditions under which it does. link
- For a non-Gaussian random variable $X$, one can carefully choose parameters so that, e.g., the skewness and excess kurtosis are zero. But it is impossible to make all of $X$'s univariate cumulants of order $n \geq 3$ vanish, since otherwise the moment generating function of $X$ would match that of a Gaussian (essentially Marcinkiewicz's theorem). link
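To illustrate the first bullet numerically, here is a hedged sketch reusing `sample_cum` from above; the exponential marginals are an arbitrary choice of non-Gaussian distribution:

```python
# For independent non-Gaussian X and Y, every mixed cumulant is zero, so the
# Monte Carlo estimates below should be close to 0 (and shrink as n grows).
rng = np.random.default_rng(0)
n = 200_000
x = rng.exponential(scale=1.0, size=n)  # non-Gaussian marginal
y = rng.exponential(scale=2.0, size=n)  # drawn independently of x

print(sample_cum(x, y))        # ~ 0  (this is the sample covariance)
print(sample_cum(x, x, y))     # ~ 0
print(sample_cum(x, x, x, y))  # ~ 0
print(sample_cum(x, x, x))     # ~ 2  (third cumulant of Exp(1) is (3-1)! = 2)
```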