
I have a multivariate AR(1) process (first-order vector autoregression, VAR(1)) of the form $$ \pmb X_{t+1} = A \pmb X_t + \zeta_t $$ where $\pmb X_t$ is a vector, $A$ is a matrix, and $\zeta_t \sim N(0,\Sigma)$ is independent of $\pmb X_t$.

I'm trying to find the (stationary) variance of $\pmb X_t$.

\begin{align} \text{Var}( \pmb X_{t+1} ) & = A \text{Var}( \pmb X_t ) A^T + \Sigma \\ & = \Sigma + A \Sigma A^T + A^2 \Sigma (A^2)^T + \dots \end{align} where the second line follows from substituting the recursion into itself repeatedly.

I know that ordinarily at this point we would use the properties of geometric series to simplify this expression. But in this case, since there is a matrix power on either side of $\Sigma$, I'm not sure how to simplify it; the scalar geometric-series formula does not apply directly.
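For concreteness, here is a quick numerical sketch of that truncated series, with a made-up stable $A$ (all eigenvalues inside the unit circle, so the series converges) and an arbitrary $\Sigma$; all values are purely illustrative:

```python
import numpy as np

# Made-up stable VAR(1): eigenvalues of A are ~0.57 and ~0.23, inside the unit circle,
# so the series Sigma + A Sigma A' + A^2 Sigma (A^2)' + ... converges.
A = np.array([[0.5, 0.1],
              [0.2, 0.3]])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 2.0]])

V = np.zeros_like(Sigma)
Ak = np.eye(2)                 # holds A^k, starting at k = 0
for _ in range(200):           # 200 terms is far more than enough for this A
    V += Ak @ Sigma @ Ak.T
    Ak = Ak @ A

print(V)                       # numerically converged stationary Var(X_t)
```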


1 Answer


First, find the multivariate Yule-Walker representation, where $\Gamma_j=E\left[(\mathbf{X}_{t} - \mathbf{\mu})(\mathbf{X}_{t-j} - \mathbf{\mu})^\top\right]$ denotes the lag-$j$ autocovariance matrix of the $n$-dimensional process: $$ \Gamma_j=\begin{cases} \Gamma_1 A^\top+\Sigma&\text{for }j=0\\ A\Gamma_{j-1}&\text{for }j>0 \end{cases} $$ We then use this to show that $$ \text{vec}(\Gamma_0)=(I_{n^2}-A\otimes A)^{-1}\text{vec}(\Sigma) $$ and $$ \text{vec}(\Gamma_j)=(I_{n}\otimes A)^{j}\text{vec}(\Gamma_0) $$ and hence $$ \text{vec}(\Gamma_j)=(I_{n}\otimes A)^{j}(I_{n^2}- A\otimes A)^{-1}\text{vec}(\Sigma) $$ In particular, $\Gamma_0=\text{Var}(\mathbf{X}_t)$ is exactly the variance you are after.
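As a quick numerical sanity check of these closed forms (same illustrative $A$ and $\Sigma$ as in the question's sketch; note that $\text{vec}$ stacks columns, which is NumPy's Fortran order):

```python
import numpy as np

A = np.array([[0.5, 0.1],
              [0.2, 0.3]])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 2.0]])
n = A.shape[0]

vec = lambda M: M.flatten(order="F")          # vec() stacks columns
unvec = lambda v: v.reshape(n, n, order="F")

# vec(Gamma_0) = (I - A kron A)^{-1} vec(Sigma)
Gamma0 = unvec(np.linalg.solve(np.eye(n**2) - np.kron(A, A), vec(Sigma)))

# vec(Gamma_j) = (I kron A)^j vec(Gamma_0), here with j = 3
j = 3
Gamma_j = unvec(np.linalg.matrix_power(np.kron(np.eye(n), A), j) @ vec(Gamma0))

# Should agree with Gamma_j = A^j Gamma_0
print(np.allclose(Gamma_j, np.linalg.matrix_power(A, j) @ Gamma0))  # True
```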

The following matrix algebraic fact is useful: $$\text{vec}(ABC)=(C^\top\otimes A)\text{vec}(B)\qquad(*)$$ for conformable $A,B,C$.
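The identity $(*)$ is easy to verify numerically with random rectangular matrices (sizes here are arbitrary; again, $\text{vec}$ is column stacking, NumPy's `"F"` order):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 2))
B = rng.normal(size=(2, 4))
C = rng.normal(size=(4, 5))

vec = lambda M: M.flatten(order="F")   # column stacking

# vec(ABC) = (C' kron A) vec(B)
print(np.allclose(vec(A @ B @ C), np.kron(C.T, A) @ vec(B)))   # True
```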

The result for $j=0$ can be shown via \begin{eqnarray*} \Gamma_0 &=& E(\mathbf{X}_t - \mathbf{\mu})(\mathbf{X}_t - \mathbf{\mu})^\top\\ &=& E(\mathbf{X}_t - \mathbf{\mu})( A(\mathbf{X}_{t-1}-\mathbf{\mu})+\zeta_t)^\top\\ &=& E(\mathbf{X}_t - \mathbf{\mu})((\mathbf{X}_{t-1}-\mathbf{\mu})^\top A^\top+\zeta_t^\top)\\ &=& E\left[(\mathbf{X}_t - \mathbf{\mu})(\mathbf{X}_{t-1}-\mathbf{\mu})^\top\right] A^\top + \underbrace{E\left[(\mathbf{X}_t - \mathbf{\mu})\zeta_t^\top\right]}_{=E\left[\sum_{j=0}^\infty A^j\zeta_{t-j}\zeta_t^\top\right]=\Sigma}\\ &=& \Gamma_1 A^\top + \Sigma \end{eqnarray*} Here the underbraced term uses the causal representation $\mathbf{X}_t-\mathbf{\mu}=\sum_{j=0}^\infty A^j\zeta_{t-j}$ of the stationary process; since the $\zeta$'s are independent across time, only the $j=0$ term has nonzero expectation, leaving $\Sigma$. For $j>0$, \begin{eqnarray*} \Gamma_j &=& E(\mathbf{X}_{t} - \mathbf{\mu})(\mathbf{X}_{t-j} - \mathbf{\mu})^\top\\ &=& E( A(\mathbf{X}_{t-1}-\mathbf{\mu})+\zeta_t)(\mathbf{X}_{t-j} - \mathbf{\mu})^\top\\ &=& AE(\mathbf{X}_{t-1}-\mathbf{\mu})(\mathbf{X}_{t-j} - \mathbf{\mu})^\top+ \underbrace{E(\zeta_t(\mathbf{X}_{t-j} - \mathbf{\mu})^\top)}_{=0}\\ &=& A \Gamma_{j-1} \end{eqnarray*} Inserting $\Gamma_1= A\Gamma_{0}$ in the result for $\Gamma_{0}$ gives $$ \Gamma_{0}= A\Gamma_0 A^\top+\Sigma $$ such that (note $\text{vec}(A+B)=\text{vec}(A)+\text{vec}(B)$) $\text{vec}(\Gamma_{0})=\text{vec}( A\Gamma_0 A^\top)+\text{vec}(\Sigma)$ and, by the vec result $(*)$ with $B=\Gamma_0$ and $C=A^\top$, $\text{vec}(\Gamma_{0})=( A\otimes A)\text{vec}(\Gamma_{0})+\text{vec}(\Sigma)$. Hence, $$ \text{vec}(\Gamma_{0})-( A\otimes A)\text{vec}(\Gamma_{0})=\text{vec}(\Sigma) $$ or $$ \text{vec}(\Gamma_0)=(I_{n^2}- A\otimes A)^{-1}\text{vec}(\Sigma). $$ The inverse exists whenever the process is stationary, i.e. all eigenvalues of $A$ lie inside the unit circle: the eigenvalues of $A\otimes A$ are products of pairs of eigenvalues of $A$, so none equals $1$.
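As an aside, $\Gamma_0 = A\Gamma_0 A^\top + \Sigma$ is a discrete Lyapunov equation, so if SciPy is available you can also solve it with `scipy.linalg.solve_discrete_lyapunov` instead of forming the $n^2 \times n^2$ Kronecker system. A minimal sketch comparing the two routes, with the same illustrative $A$ and $\Sigma$:

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

A = np.array([[0.5, 0.1],
              [0.2, 0.3]])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 2.0]])
n = A.shape[0]

# Kronecker route: vec(Gamma_0) = (I - A kron A)^{-1} vec(Sigma)
vecS = Sigma.flatten(order="F")
Gamma0 = np.linalg.solve(np.eye(n**2) - np.kron(A, A), vecS).reshape(n, n, order="F")

# Lyapunov route: solves Gamma_0 = A Gamma_0 A' + Sigma directly
Gamma0_lyap = solve_discrete_lyapunov(A, Sigma)

print(np.allclose(Gamma0, Gamma0_lyap))                  # True
print(np.allclose(Gamma0, A @ Gamma0 @ A.T + Sigma))     # fixed point holds
```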

For $\Gamma_j$, write $\Gamma_j = A^j\Gamma_{0}$ as $\Gamma_j = A^j\Gamma_{0}I_n$ and apply the above matrix result $(*)$ to get $$ \text{vec}(\Gamma_j)=(I_{n}\otimes A^j)\text{vec}(\Gamma_0)=(I_{n}\otimes A)^j\text{vec}(\Gamma_0), $$ where the last step uses the mixed-product property $(I_n\otimes A)(I_n\otimes A)=I_n\otimes A^2$, and so on. The second representation for $\text{vec}(\Gamma_j)$ is then obtained simply by inserting the expression for $\text{vec}(\Gamma_0)$.
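One can also check everything against a simulation: generate a long path of the process and compare sample autocovariances with the closed forms (illustrative parameters again; agreement is up to sampling error of order $1/\sqrt{T}$):

```python
import numpy as np

rng = np.random.default_rng(42)
A = np.array([[0.5, 0.1],
              [0.2, 0.3]])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 2.0]])
n, T = A.shape[0], 200_000

# Simulate X_{t+1} = A X_t + zeta_t with zeta_t ~ N(0, Sigma)
L = np.linalg.cholesky(Sigma)
X = np.zeros((T, n))
for t in range(1, T):
    X[t] = A @ X[t - 1] + L @ rng.normal(size=n)

# Theoretical Gamma_0 via the Kronecker formula; Gamma_1 = A Gamma_0
Gamma0 = np.linalg.solve(np.eye(n**2) - np.kron(A, A),
                         Sigma.flatten(order="F")).reshape(n, n, order="F")

print(X.T @ X / T)            # sample Gamma_0
print(Gamma0)                 # theoretical Gamma_0
print(X[1:].T @ X[:-1] / T)   # sample Gamma_1
print(A @ Gamma0)             # theoretical Gamma_1
```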

This boils down to the well-known expression in the univariate AR(1) case, with $n=1$, $A=a$, $\Sigma=\sigma^2$, so that $$\text{vec}(\Gamma_j)=(I_{n}\otimes A)^{j}(I_{n^2}- A\otimes A)^{-1}\text{vec}(\Sigma)\Rightarrow\gamma_j=(1\otimes a)^{j}(1- a\otimes a)^{-1}\sigma^2$$ or $$\gamma_j=a^{j}(1- a^2)^{-1}\sigma^2$$
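And the scalar case in one line (illustrative values $a=0.8$, $\sigma^2=1$, $j=3$):

```python
a, sigma2, j = 0.8, 1.0, 3
print(a**j * sigma2 / (1 - a**2))   # gamma_j = a^j sigma^2 / (1 - a^2) = 1.4222...
```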