
Let $$Y =\begin{pmatrix} Y_1 \\ Y_2 \end{pmatrix} \sim N(0, \Sigma), \qquad \Sigma=\begin{pmatrix} \sigma_{11} & \sigma_{12}\\ \sigma_{21} & \sigma_{22} \end{pmatrix}.$$

Show that $$\mathbb E(Y_1^2Y_2^2)=\sigma_{11}\sigma_{22} + 2\sigma_{12}^2.$$ I have tried manipulating this every way I can think of, but I keep needing $\operatorname{Var}(Y_1Y_2)$, which I don't have. The solution I was given only uses the fact that "the distribution is Gaussian".
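
A quick Monte Carlo sanity check of the identity (using an arbitrarily chosen covariance matrix, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)

# An arbitrarily chosen valid covariance matrix, just for the check
Sigma = np.array([[1.0, 0.6],
                  [0.6, 2.0]])

# Draw a large sample from N(0, Sigma) and compare the empirical moment
Y = rng.multivariate_normal(mean=[0.0, 0.0], cov=Sigma, size=2_000_000)
empirical = np.mean(Y[:, 0]**2 * Y[:, 1]**2)
claimed = Sigma[0, 0] * Sigma[1, 1] + 2 * Sigma[0, 1]**2

print(empirical, claimed)  # the two values should agree to roughly two decimals
```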

Kilkik

5 Answers


Since all Gaussian variables have a kurtosis of $3,$ any Gaussian variable with variance $\sigma^2$ has a central fourth moment of $3\sigma^4.$ In particular, when $Z$ is a zero-mean Gaussian,

$$E[Z^4] = 3\sigma^4.$$

We will apply this below to the Gaussian variables $Z = Y_1\pm Y_2$ by observing (from the symmetry of $\Sigma$) that $$\operatorname{Var}(Y_1\pm Y_2) = \sigma_{11} + \sigma_{22} \pm 2\sigma_{12}.$$

Emulating the method I describe at https://stats.stackexchange.com/a/267021/919 (based on the Polarization Identity for $n=4$ with $x_1 = x_2 = Y_1$ and $x_3 = x_4 = Y_2$), write

$$12Y_1^2Y_2^2 = (Y_1+Y_2)^4 + (Y_1-Y_2)^4 - 2(Y_1^4) - 2(Y_2^4).$$

Use linearity of expectation to compute

$$\begin{aligned} 12 E[Y_1^2 Y_2^2] &= E[(Y_1+Y_2)^4] + E[(Y_1-Y_2)^4] - 2E[Y_1^4] - 2E[Y_2^4]\\ & = 3(\sigma_{11} + \sigma_{22} + 2\sigma_{12})^2 + 3(\sigma_{11} + \sigma_{22} - 2\sigma_{12})^2 - 2(3\sigma_{11}^2) - 2(3\sigma_{22}^2)\\ &= 12\sigma_{11}\sigma_{22} + 24\sigma_{12}^2. \end{aligned}$$

Dividing both sides by $12$ gives the result in the question.
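
The algebra in that display can also be checked symbolically. Here is a minimal SymPy sketch (the code and symbol names are my addition) verifying both the polarization-style identity and the final simplification:

```python
import sympy as sp

y1, y2 = sp.symbols('y1 y2')
s11, s12, s22 = sp.symbols('sigma11 sigma12 sigma22', positive=True)

# The polarization-style identity used above
identity = sp.expand((y1 + y2)**4 + (y1 - y2)**4 - 2*y1**4 - 2*y2**4)
print(identity)  # 12*y1**2*y2**2

# Substitute the Gaussian fourth moments E[Z^4] = 3*Var(Z)^2 term by term
rhs = (3*(s11 + s22 + 2*s12)**2 + 3*(s11 + s22 - 2*s12)**2
       - 2*3*s11**2 - 2*3*s22**2)
print(sp.expand(rhs / 12))  # sigma11*sigma22 + 2*sigma12**2
```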

whuber

Use the characteristic function. It is well known that

$$\varphi_Y(s) := \mathbb E\left[e^{is^\intercal Y}\right] = e^{-\frac12s^\intercal \Sigma s}.$$

So \begin{align} \frac{\partial^2 \varphi_Y}{\partial s_1^2} = \mathbb E\left[-Y_1^2e^{is^\intercal Y}\right] = \frac{\partial}{\partial s_1}\left[\left(-\sigma_{11}s_1-\sigma_{12}s_2\right)e^{-\frac12s^\intercal \Sigma s}\right] = \left(-\sigma_{11} + \left(\sigma_{11}s_1 + \sigma_{12}s_2\right)^2\right)e^{-\frac12s^\intercal \Sigma s}. \end{align}

Setting $s_1 = 0$ gives

$$-\mathbb E\left[Y_1^2 e^{is_2Y_2}\right] = \left(-\sigma_{11} + \sigma_{12}^2s_2^2\right)e^{-\frac12\sigma_{22}s_2^2}$$

and then,

$$\begin{aligned} \mathbb E\left[Y_1^2 Y_2^2e^{is_2Y_2}\right] &= \frac{\partial^2}{\partial s_2^2}\mathbb E\left[-Y_1^2 e^{is_2Y_2}\right] = \frac{\partial}{\partial s_2}\left[\left(2\sigma_{12}^2s_2 +\sigma_{11}\sigma_{22}s_2 - \sigma_{12}^2 \sigma_{22}s_2^3\right)e^{-\frac12\sigma_{22}s_2^2}\right] \\ &= \left(2\sigma_{12}^2 + \sigma_{11}\sigma_{22} -3\sigma_{12}^2\sigma_{22}s^2_2-2\sigma_{12}^2\sigma_{22}s^2_2 - \sigma_{11}\sigma^2_{22}s^2_2 + \sigma_{12}^2 \sigma_{22}^2s_2^4\right)e^{-\frac12\sigma_{22}s_2^2} \end{aligned}$$

Setting $s_2 = 0$ gives

$$\mathbb E\left[Y_1^2Y_2^2\right] = 2\sigma_{12}^2 + \sigma_{11}\sigma_{22}$$


It is worth mentioning that with this method you can compute every mixed moment

$$\mathbb E\left[Y_1^kY_2^\ell\right]$$
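
for any nonnegative integers $k$ and $\ell$. To illustrate, here is a small SymPy sketch (my own code) implementing the recipe $\mathbb E\left[Y_1^kY_2^\ell\right]=i^{-(k+\ell)}\,\frac{\partial^{k+\ell}\varphi_Y}{\partial s_1^k\,\partial s_2^\ell}\Big|_{s=0}$:

```python
import sympy as sp

s1, s2 = sp.symbols('s1 s2', real=True)
s11, s12, s22 = sp.symbols('sigma11 sigma12 sigma22', positive=True)

# Characteristic function of the bivariate Gaussian: exp(-1/2 s^T Sigma s)
phi = sp.exp(-sp.Rational(1, 2) * (s11*s1**2 + 2*s12*s1*s2 + s22*s2**2))

def mixed_moment(k, l):
    """E[Y1^k Y2^l] via k+l derivatives of the characteristic function at s = 0."""
    d = sp.diff(phi, s1, k, s2, l)
    return sp.simplify(d.subs({s1: 0, s2: 0}) / sp.I**(k + l))

print(mixed_moment(2, 2))  # sigma11*sigma22 + 2*sigma12**2
print(mixed_moment(1, 3))  # 3*sigma12*sigma22
```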

Kroki

You can use the conditional distributions, which are univariate normal.

For example, $Y_2$ given $Y_1$ is normal with mean $$E\left[Y_2\mid Y_1\right]=\frac{\sigma_{12}}{\sigma_{11}} Y_1$$ and variance $$\operatorname{Var}\left[Y_2\mid Y_1\right]=\left(1-\frac{\sigma_{12}^2}{\sigma_{11}\sigma_{22}}\right)\sigma_{22}$$

So, by law of total expectation,

\begin{align} E\left[Y_1^2Y_2^2\right]&=E\,E\left[Y_1^2Y_2^2\mid Y_1\right] \\&=E\left[Y_1^2 E\left[Y_2^2\mid Y_1\right]\right] \\&=E\left[Y_1^2\left\{\operatorname{Var}\left[Y_2\mid Y_1\right]+(E\left[Y_2\mid Y_1\right])^2\right\}\right] \end{align}

You would need the fourth moment of $Y_1$, which is $E\left[Y_1^4\right]=3\sigma^2_{11}$.
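
For completeness, carrying the hint through with $\operatorname{Var}\left[Y_2\mid Y_1\right]=\sigma_{22}-\sigma_{12}^2/\sigma_{11}$ (the remaining algebra added here is mine):

\begin{align} E\left[Y_1^2Y_2^2\right] &= E\left[Y_1^2\left(\sigma_{22}-\frac{\sigma_{12}^2}{\sigma_{11}}\right)+Y_1^2\cdot\frac{\sigma_{12}^2}{\sigma_{11}^2}Y_1^2\right] \\ &=\left(\sigma_{22}-\frac{\sigma_{12}^2}{\sigma_{11}}\right)\sigma_{11}+\frac{\sigma_{12}^2}{\sigma_{11}^2}\cdot 3\sigma_{11}^2 \\ &=\sigma_{11}\sigma_{22}-\sigma_{12}^2+3\sigma_{12}^2=\sigma_{11}\sigma_{22}+2\sigma_{12}^2. \end{align}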

StubbornAtom

Represent $Y_2$ by $\frac{\sigma_{12}}{\sigma_{11}}\cdot Y_1 + \sqrt{\sigma_{22} - \frac{\sigma_{12}^2}{\sigma_{11}} } \cdot Z$, where $Z \sim \mathop{\mathcal N}\left(0,1\right), Z \perp \!\!\! \perp Y_1$ (this choice of coefficient makes $\operatorname{Var}(Y_2)=\sigma_{22}$ and $\operatorname{Cov}(Y_1,Y_2)=\sigma_{12}$).
Now, $$ {Y_1}^2 \cdot \left( \frac{\sigma_{12}}{\sigma_{11}} \cdot Y_1 + \sqrt{\sigma_{22} - \frac{\sigma_{12}^2}{\sigma_{11}} } \cdot Z \right)^2 \\ = {Y_1}^4 \cdot \frac{\sigma_{12}^2}{\sigma_{11}^2} + 2 \cdot {Y_1}^3 \cdot Z \cdot \frac{\sigma_{12}}{\sigma_{11}} \cdot \sqrt{\sigma_{22} - \frac{\sigma_{12}^2}{\sigma_{11}} } + {Y_1}^2 \cdot Z^2 \cdot \left( \sigma_{22} - \frac{\sigma_{12}^2}{\sigma_{11}} \right). $$ Using the independence of $Y_1$ and $Z$ (and linearity) to take the expectation of this expression gives \begin{align} \mathop{\mathbb E}\left({Y_1}^2{Y_2}^2\right) &= \mathop{\mathbb E}\left({Y_1}^4 \right) \cdot \frac{\sigma_{12}^2}{\sigma_{11}^2} + \mathop{\mathbb E}\left(Z\right)\cdot \left(\dots\right) + \mathop{\mathbb E}\left({Y_1}^2\right) \cdot \mathop{\mathbb E}\left(Z^2\right) \cdot \left( \sigma_{22} - \frac{\sigma_{12}^2}{\sigma_{11}} \right) \\ &= 3 \sigma_{11}^2 \cdot \frac{\sigma_{12}^2}{\sigma_{11}^2} + 0 + \sigma_{11} \cdot 1 \cdot \left( \sigma_{22} - \frac{\sigma_{12}^2}{\sigma_{11}} \right) \\ &= \sigma_{11} \sigma_{22} + 2 \sigma_{12}^2. \end{align}
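
As a sanity check on these coefficients, a small SymPy sketch (the symbol names are mine) confirms that the representation reproduces $\operatorname{Var}(Y_2)=\sigma_{22}$ and $\operatorname{Cov}(Y_1,Y_2)=\sigma_{12}$, and yields the stated moment:

```python
import sympy as sp

s11, s12, s22 = sp.symbols('sigma11 sigma12 sigma22', positive=True)

# Representation Y2 = alpha*Y1 + beta*Z with Z ~ N(0,1) independent of Y1
alpha = s12 / s11
beta = sp.sqrt(s22 - s12**2 / s11)

print(sp.simplify(alpha**2 * s11 + beta**2))  # Var(Y2)      -> sigma22
print(sp.simplify(alpha * s11))               # Cov(Y1, Y2)  -> sigma12

# E[Y1^2 Y2^2] = alpha^2 E[Y1^4] + beta^2 E[Y1^2] E[Z^2], with E[Y1^4] = 3*sigma11^2
print(sp.simplify(alpha**2 * 3*s11**2 + beta**2 * s11))  # sigma11*sigma22 + 2*sigma12**2
```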

statmerkur

One way to approach this problem is to exploit the Cholesky decomposition of $\Sigma$. Let $\Sigma = LL^\top$ be the Cholesky decomposition of $\Sigma$, where $L = \begin{bmatrix} a & 0 \\ b & c \end{bmatrix}$ is a lower triangular matrix. $Y$ then has the same distribution as $LX$, where $X = \begin{bmatrix} X_1 \\ X_2 \end{bmatrix} \sim N(0, I_{(2)})$. Therefore, $Y_1 \overset{d}{=} aX_1$, $Y_2 \overset{d}{=} bX_1 + cX_2$, whence \begin{align} & E[Y_1^2Y_2^2] = E[a^2X_1^2(b^2X_1^2 + 2bcX_1X_2 + c^2X_2^2)] \\ =& a^2b^2E[X_1^4] +2a^2bcE[X_1^3X_2] + a^2c^2E[X_1^2X_2^2] \\ =& 3a^2b^2 + a^2c^2. \tag{1}\label{1} \end{align} In the last step, we used $E[X_1^4] = 3, E[X_1^3X_2] = E[X_1^3]E[X_2] = 0$ and $E[X_1^2X_2^2] = E[X_1^2]E[X_2^2] = 1$, which follow from $X \sim N(0, I_{(2)})$.

Next, expanding the decomposition $LL^\top = \Sigma$ and comparing entries on both sides yields $a^2 = \sigma_{11}$, $ab = \sigma_{12}$ and $b^2 + c^2 = \sigma_{22}$. Hence $\eqref{1}$ reduces to \begin{align} E[Y_1^2Y_2^2] = 2a^2b^2 + a^2(b^2 + c^2) = 2\sigma_{12}^2 + \sigma_{11}\sigma_{22}, \end{align} as desired.
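
A quick numerical check of $\eqref{1}$ and the final identity, using an arbitrarily chosen covariance matrix (the numbers are only illustrative):

```python
import numpy as np

# Arbitrarily chosen valid covariance matrix, just for the check
Sigma = np.array([[2.0, 0.7],
                  [0.7, 1.5]])

# Lower-triangular Cholesky factor L with Sigma = L @ L.T
L = np.linalg.cholesky(Sigma)
a, b, c = L[0, 0], L[1, 0], L[1, 1]

via_cholesky = 3 * a**2 * b**2 + a**2 * c**2                  # equation (1)
closed_form = Sigma[0, 0] * Sigma[1, 1] + 2 * Sigma[0, 1]**2  # claimed result

print(via_cholesky, closed_form)  # both equal 3.98 for this Sigma
```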

Zhanxiong