
Assume you have $A(L)y_t = B(L)e_t$ and $e_t$ is a zero mean white noise with variance $\sigma^2$. Why is the long-run variance of $y_t$ equal to $\sigma^2\left(\frac{B(1)}{A(1)}\right)^2$?

I know that the long-run variance is the infinite sum of all autocovariances of $y_t$ and that it can also be written as $\gamma(0) + 2\sum_{j=1}^{\infty}\gamma(j)$, where $\gamma(0)$ is the variance of $y_t$ and $\gamma(j)$ is the $j$-th autocovariance $\mathrm{Cov}(y_t,y_{t-j})$. But I struggle to reach this form $\sigma^2\left(\frac{B(1)}{A(1)}\right)^2$.
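As a sanity check that the two expressions agree, here is a minimal pure-Python sketch for a hypothetical AR(1), $y_t = a\,y_{t-1} + e_t$, where the autocovariances are known in closed form, $\gamma(j) = \sigma^2 a^{j}/(1-a^2)$; the values $a = 0.7$, $\sigma^2 = 2$ are chosen only for illustration:

```python
# Hypothetical AR(1): y_t = a*y_{t-1} + e_t, so A(L) = 1 - a*L, B(L) = 1.
# Known autocovariances: gamma(j) = sigma2 * a**j / (1 - a**2) for |a| < 1.
a, sigma2 = 0.7, 2.0

def gamma(j):
    return sigma2 * a**j / (1 - a**2)

# Long-run variance as the sum of all autocovariances (truncated; the
# geometric tail beyond 2000 lags is negligible for |a| = 0.7).
lrv_from_autocov = gamma(0) + 2 * sum(gamma(j) for j in range(1, 2000))

# The claimed closed form sigma^2 * (B(1)/A(1))^2 = sigma^2 / (1 - a)^2.
lrv_formula = sigma2 / (1 - a) ** 2

print(lrv_from_autocov, lrv_formula)  # both approx 22.22
```

The two numbers coincide, while the unconditional variance $\gamma(0) = \sigma^2/(1-a^2)$ is a different (smaller) quantity.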

1 Answer


By definition, the long-run variance is the infinite sum of all autocovariances of $y_t$:

$LRV(y_t) = \sum_{k=-\infty}^{\infty} \gamma_{k}$, where $\gamma_k = \mathrm{Cov}(y_t, y_{t+k})$.

Rewrite $y_t$ in its Wold (MA($\infty$)) representation: $y_t = A^{-1}(L)B(L)e_t = \Psi(L) e_t$, where $\Psi(L) = \sum_{j=0}^{\infty}\psi_j L^j$.

Then

$\sum_{k=-\infty}^{\infty} \gamma_{k} = \sum_{k=-\infty}^{\infty} COV\left(\sum_{i=0}^{\infty}\psi_i e_{t-i},\ \sum_{j=0}^{\infty}\psi_j e_{t+k-j}\right) = $

$\sum_{k=-\infty}^{\infty} \sum_{i=0}^{\infty} \sum_{j=0}^{\infty} \psi_i \psi_j COV(e_{t-i},e_{t+k-j}) =$

$\sigma ^2 \sum_{k=-\infty}^{\infty} \sum_{i=0}^{\infty} \sum_{j=0}^{\infty} \psi_i \psi_j I(t-i = t + k - j) =$

$\sigma ^2 \sum_{k=-\infty}^{\infty} \sum_{i=0}^{\infty} \psi_i \psi_{k+i}$ (with the convention $\psi_j = 0$ for $j < 0$) $=$

$\sigma ^2 \sum_{i=0}^{\infty} \psi_i \sum_{k=-i}^{\infty} \psi_{k+i}$ (interchanging the sums: for fixed $i$, every term with $k < -i$ vanishes, so the inner sum starts at $k = -i$) $=$

$\sigma ^2 \sum_{i=0}^{\infty} \psi_i \sum_{m=0}^{\infty} \psi_{m}$ (substituting $m = k + i$) $=$

$\sigma^2 \left(\sum_{i=0}^{\infty} \psi_i\right)^2 =$

$ \sigma^2\, \Psi(1)^2 = $

$\sigma^2 \left(\frac{1+b_1 + ... + b_q}{1 - a_1 - ... - a_p}\right)^2 =$

$ \sigma^2 \left(\frac{B(1)}{A(1)}\right)^2, $

where the last steps evaluate the lag polynomials $A(L) = 1 - a_1 L - \dots - a_p L^p$ and $B(L) = 1 + b_1 L + \dots + b_q L^q$ at $L = 1$.
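The key identity, that the sum of the MA($\infty$) weights equals $B(1)/A(1)$, can also be verified numerically. A minimal sketch for a hypothetical ARMA(1,1), $(1 - a_1 L)y_t = (1 + b_1 L)e_t$, with illustrative values $a_1 = 0.5$, $b_1 = 0.3$:

```python
# Hypothetical ARMA(1,1): (1 - a1*L) y_t = (1 + b1*L) e_t, |a1| < 1.
a1, b1, sigma2 = 0.5, 0.3, 1.0

# MA(inf) weights via the standard recursion:
# psi_0 = 1, psi_1 = a1*psi_0 + b1, psi_j = a1*psi_{j-1} for j >= 2.
psi = [1.0, a1 * 1.0 + b1]
for _ in range(2, 200):  # the geometric tail beyond 200 terms is negligible
    psi.append(a1 * psi[-1])

# Long-run variance two ways: sigma^2 * (sum of psi weights)^2
# versus the closed form sigma^2 * (B(1)/A(1))^2.
lrv_from_psi = sigma2 * sum(psi) ** 2
lrv_closed_form = sigma2 * ((1 + b1) / (1 - a1)) ** 2

print(lrv_from_psi, lrv_closed_form)  # both approx 6.76
```

Both computations give $\sigma^2\,(2.6)^2 = 6.76$, matching $B(1)/A(1) = 1.3/0.5$.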

Matt P
  • Thank you for your answer, it is obviously helpful for the LRV of general linear processes, too. Could you elaborate on what you did in step 4 to 5? To be more precise, why does the starting index of the sum change when interchanging the sums? – Max Beikirch Sep 23 '19 at 10:35
  • 4
    If we take the unconditional variance for $AR(1)$ then it is given by $\gamma(0) = \sigma^2 / (1 - a_1^2)$ which doesn't seem to agree with this formula. Is this LVR the same as the unconditional variance of a stationary process or is it something else? Thank you – Confounded Nov 15 '20 at 15:03
  • Indeed. What are $b_1,\dots,b_q$ and $a_1,\dots,a_p$? – Richard Hardy Feb 01 '21 at 15:02
  • This answer https://stats.stackexchange.com/a/471296/234732 would point to this answer being incorrect. – David Veitch Oct 03 '22 at 19:50
  • I believe where this answer goes wrong is in asserting $\sum \psi_i^2=(\sum \psi_i)^2$ – David Veitch Oct 03 '22 at 20:01