
This is a homework problem I'm trying to solve, but I can't seem to do Q1b without using the theorem.

(image of the homework problem)

I am also given the fact that

$$E(y'Ay)=\operatorname{tr}(A\Sigma)+\mu'A\mu$$

I've tried using the trace-expectation trick but to no avail. Assuming $y\sim N(0,I)$, $$\begin{align*} \operatorname{var}(y'Ay) &=E[(y'Ay-\operatorname{tr}(A))^2]\\ &=E(y'Ayy'Ay)-\operatorname{tr}(A)^2\\ &=E(\operatorname{tr}[y'Ayy'Ay])-\operatorname{tr}(A)^2\\ &=E(\operatorname{tr}[Ayy'Ayy'])-\operatorname{tr}(A)^2\\ &=\operatorname{tr}(E[Ayy'Ayy'])-\operatorname{tr}(A)^2\\ &=\operatorname{tr}(AE[yy'Ayy'])-\operatorname{tr}(A)^2 \end{align*}$$ Then I'm stuck.


2 Answers


Suppose $A=((a_{ij}))$ and $y=(y_1,y_2,\ldots,y_n)'\sim N(0,I_n)$, so that the $y_i$ are i.i.d. standard normal.

You already have $\operatorname E[y'Ay]=\operatorname{tr}(A)$ by the result you quoted.

For the variance, you can simply use $$\operatorname{Var}(y'Ay)=\operatorname E[(y'Ay)^2]-(\operatorname E[y'Ay])^2\tag{1}$$

To compute the first expectation, we write the quadratic form as $y'Ay=\sum_{i,j}a_{ij}y_iy_j$ to get $$(y'Ay)^2=\sum_{i,j,k,l}a_{ij}a_{kl}y_iy_jy_ky_l \tag{2}$$

Now observe that $$\operatorname E[y_iy_jy_ky_l]=\begin{cases}3&\text{if }i=j=k=l \\ 1&\text{if }i=j\ne k=l,\text{ or }i=k\ne j=l,\text{ or }i=l\ne j=k \\ 0&\text{otherwise}\end{cases}$$

Therefore taking expectation on both sides of $(2)$,

$$\operatorname E[(y'Ay)^2]=3\sum_i a_{ii}^2+\sum_i\left(\sum_{k\ne i}a_{ii}a_{kk}+\sum_{j\ne i}a_{ij}^2+\sum_{j\ne i}a_{ij}a_{ji}\right)\tag{3}$$

Keeping in mind that $A$ is symmetric, you have $\operatorname{tr}(A^2)=\sum_{i,j}a_{ij}^2$.

It is now straightforward to see that $(3)$ reduces to $$\operatorname E[(y'Ay)^2]=(\operatorname{tr}(A))^2+2\operatorname{tr}(A^2)$$ Indeed, $3\sum_i a_{ii}^2+\sum_i\sum_{k\ne i}a_{ii}a_{kk}=\left(\sum_i a_{ii}\right)^2+2\sum_i a_{ii}^2$, while by symmetry the last two sums in $(3)$ each equal $\sum_i\sum_{j\ne i}a_{ij}^2$; adding everything up gives $(\operatorname{tr}(A))^2+2\sum_{i,j}a_{ij}^2$.

From $(1)$ you get the desired result $$\boxed{\operatorname{Var}(y'Ay)=2\operatorname{tr}(A^2)}$$
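As a quick sanity check (not part of the proof), the boxed formula can be verified by Monte Carlo simulation in NumPy. The matrix $A$ below is just a random symmetric example:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# An arbitrary symmetric matrix A (example only; any symmetric A works)
B = rng.standard_normal((n, n))
A = (B + B.T) / 2

# Draw many samples y ~ N(0, I_n) and compute the quadratic form y'Ay row-wise
y = rng.standard_normal((500_000, n))
q = np.einsum('ki,ij,kj->k', y, A, y)

print(q.var())              # Monte Carlo estimate of Var(y'Ay)
print(2 * np.trace(A @ A))  # closed form: 2 tr(A^2)
```

With half a million samples the two numbers should agree to within about a percent.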

You can now try to generalize this to the case $y\sim N(0,\Sigma)$ and hence to $y\sim N(\mu,\Sigma)$.

Reference:

  • Linear Regression Analysis by Seber and Lee.
Since $A$ is symmetric, the easier way to find the variance directly for the $N(0,I_n)$ case is to use the spectral decomposition to write $A=P'\Lambda P$ for some orthogonal $P$ and $\Lambda=\operatorname{diag}(\lambda_1,\ldots,\lambda_n)$, where the $\lambda_i$ are the eigenvalues of $A$. Then $y'Ay=x'\Lambda x=\sum_{i=1}^n \lambda_i x_i^2$ with $x=Py\sim N(0,I_n)$. – StubbornAtom Jul 14 '20 at 16:12

Since the passage from part (c) to part (d) is non-trivial, let me add a new answer to this interesting question.

(b) Let $A = O'\Lambda O$ be the spectral decomposition of $A$, where $O$ is an order $n$ orthogonal matrix and $\Lambda = \operatorname{diag}(\lambda_1, \ldots, \lambda_n)$. Let $z = Oy$; then $z \sim N(0, I_n)$, since $y \sim N(0, I_n)$ and $O$ is orthogonal. It then follows from $y'Ay = y'O'\Lambda Oy = z'\Lambda z = \sum\limits_{i = 1}^n\lambda_i z_i^2$ and the independence of $z_1^2, \ldots, z_n^2$ that \begin{align} \operatorname{Var}(y'Ay) = \sum_{i = 1}^n\lambda_i^2\operatorname{Var}(z_i^2) = \sum_{i = 1}^n\lambda_i^2(E[z_i^4] - (E[z_i^2])^2) = 2\sum_{i = 1}^n\lambda_i^2 = 2\operatorname{tr}(A^2). \tag{1} \end{align} In $(1)$, we used that $E[z_i^4] = 3$ and $E[z_i^2] = 1$ for $z_i \sim N(0, 1)$.
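The identity $\sum_i \lambda_i^2 = \operatorname{tr}(A^2)$ used in the last step of $(1)$ can be checked exactly (not just by simulation); here is a small NumPy sketch with a random symmetric example matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# Arbitrary symmetric A (example only)
B = rng.standard_normal((n, n))
A = (B + B.T) / 2

lam = np.linalg.eigvalsh(A)  # eigenvalues of the symmetric matrix A

# Twice the sum of squared eigenvalues equals 2 tr(A^2) = Var(y'Ay)
print(2 * np.sum(lam**2))
print(2 * np.trace(A @ A))
```

The two printed values agree up to floating-point error, since $\operatorname{tr}(A^2)$ is the sum of the eigenvalues of $A^2$, which are $\lambda_i^2$.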

(c) If $y \sim N(0, \Sigma)$, we can write $y = \Sigma^{1/2}z$, where $z \sim N(0, I_n)$. It then follows from $(1)$ and the symmetry of $(\Sigma^{1/2})'A\Sigma^{1/2}$ that \begin{align} \operatorname{Var}(y'Ay) &= \operatorname{Var}(z'(\Sigma^{1/2})'A\Sigma^{1/2}z) = 2\operatorname{tr}((\Sigma^{1/2}A\Sigma^{1/2})^2) \\ &= 2\operatorname{tr}(\Sigma^{1/2}A\Sigma A\Sigma^{1/2}) = 2\operatorname{tr}(A\Sigma A\Sigma) = 2\operatorname{tr}((A\Sigma)^2). \tag{2} \end{align} In $(2)$, we used that $(\Sigma^{1/2})' = \Sigma^{1/2}$ and $\operatorname{tr}(M_1M_2) = \operatorname{tr}(M_2M_1)$ for order $n$ matrices $M_1, M_2$.
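Formula $(2)$ can also be verified numerically. In the sketch below $A$ and $\Sigma$ are arbitrary examples, and a Cholesky factor is used in place of $\Sigma^{1/2}$; this is harmless because the distribution of $y$ depends only on $\Sigma$, not on which square root is used:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

# Arbitrary symmetric A and positive-definite Sigma (examples only)
B = rng.standard_normal((n, n))
A = (B + B.T) / 2
C = rng.standard_normal((n, n))
Sigma = C @ C.T + n * np.eye(n)

# y ~ N(0, Sigma) via y = L z with L L' = Sigma (Cholesky factor as square root)
L = np.linalg.cholesky(Sigma)
y = rng.standard_normal((500_000, n)) @ L.T

q = np.einsum('ki,ij,kj->k', y, A, y)
print(q.var())                              # Monte Carlo estimate of Var(y'Ay)
print(2 * np.trace(A @ Sigma @ A @ Sigma))  # closed form: 2 tr((A Sigma)^2)
```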

(d) If $y \sim N(\mu, \Sigma)$, we can write $y = \mu + z$, where $z \sim N(0, \Sigma)$, whence \begin{align} y'Ay = (\mu + z)'A(\mu + z) = \mu'A\mu + 2\mu'Az + z'Az. \end{align} Therefore, \begin{align} \operatorname{Var}(y'Ay) &= \operatorname{Var}(z'Az + 2\mu'Az + \mu'A\mu) = \operatorname{Var}(z'Az + 2\mu'Az) \\ &=\operatorname{Var}(z'Az) + 4\operatorname{Var}(\mu'Az) + 2\operatorname{Cov}(z'Az, 2\mu'Az). \tag{3} \end{align} By $(2)$, \begin{align} \operatorname{Var}(z'Az) = 2\operatorname{tr}((A\Sigma)^2). \tag{4} \end{align} In addition, \begin{align} \operatorname{Var}(\mu'Az) = \mu'A\operatorname{Var}(z)A\mu = \mu'A\Sigma A\mu. \tag{5} \end{align} To evaluate $\operatorname{Cov}(z'Az, 2\mu'Az) = 2E[z'Az\,\mu'Az]$, note that the expansion of $z'Az\,\mu'Az$ consists only of third-order monomials $z_iz_jz_k$, $1 \leq i, j, k \leq n$ (up to constant coefficients), and for the centered Gaussian $z \sim N(0, \Sigma)$ all third-order moments vanish, $E[z_iz_jz_k] = 0$, since $z$ and $-z$ have the same distribution. Therefore, \begin{align} \operatorname{Cov}(z'Az, 2\mu'Az) = 0. \tag{6} \end{align} Substituting $(4), (5), (6)$ into $(3)$, we obtain \begin{align} \operatorname{Var}(y'Ay) = 2\operatorname{tr}((A\Sigma)^2) + 4\mu'A\Sigma A\mu. \end{align} This completes the proof.
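The final formula can likewise be checked by simulation. In this sketch $A$, $\Sigma$, and $\mu$ are arbitrary examples:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4

# Arbitrary symmetric A, positive-definite Sigma, and mean mu (examples only)
B = rng.standard_normal((n, n))
A = (B + B.T) / 2
C = rng.standard_normal((n, n))
Sigma = C @ C.T + n * np.eye(n)
mu = rng.standard_normal(n)

# y ~ N(mu, Sigma)
L = np.linalg.cholesky(Sigma)
y = mu + rng.standard_normal((500_000, n)) @ L.T

q = np.einsum('ki,ij,kj->k', y, A, y)
theory = 2 * np.trace(A @ Sigma @ A @ Sigma) + 4 * mu @ A @ Sigma @ A @ mu
print(q.var())  # Monte Carlo estimate of Var(y'Ay)
print(theory)   # 2 tr((A Sigma)^2) + 4 mu' A Sigma A mu
```

Note that both terms of the closed form are nonnegative: $\operatorname{tr}((A\Sigma)^2)$ is a sum of squared eigenvalues of $\Sigma^{1/2}A\Sigma^{1/2}$, and $\mu'A\Sigma A\mu = (A\mu)'\Sigma(A\mu) \ge 0$.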
