
There are many sources explaining why a "low-order" ARMA$(p,q)$ model (with small but non-zero $p$ and $q$) can, in theory, be expressed as an AR$(\infty)$ model (or as an MA$(\infty)$ as well). For example, this question discusses it.

Now my question is more practical: I have some data, and an auto-arima procedure on it reports that the lowest-AIC model is ARMA$(2,1)$ (the data are stationary; I verified this beforehand with the appropriate tests). The AIC value happened to be $X$. I then fitted successive AR$(p)$ models for increasing $p$ until, at AR$(13)$, I also reached an AIC of $X$. All but one coefficient of the AR$(13)$ model were significant at the $0.05$ level (and the original ARMA$(2,1)$ also had one insignificant coefficient at the same level).
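For concreteness, here is a minimal sketch of that fitting loop, assuming the series is a pandas Series named `y` and using statsmodels' ARIMA class for both the benchmark and the AR fits (the placeholder data and the stopping rule are my own assumptions, not something from the original post):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# y: a stationary univariate series -- placeholder white noise here,
# replace with your own data
rng = np.random.default_rng(0)
y = pd.Series(rng.standard_normal(500))

# Benchmark: the ARMA(2,1) model that auto-arima selected
arma_fit = ARIMA(y, order=(2, 0, 1)).fit()
target_aic = arma_fit.aic

# Fit successive AR(p) models and stop once the AIC matches (or beats)
# the ARMA(2,1) benchmark
for p in range(1, 21):
    ar_fit = ARIMA(y, order=(p, 0, 0)).fit()
    print(f"AR({p}): AIC = {ar_fit.aic:.2f}")
    if ar_fit.aic <= target_aic:
        print(f"AR({p}) reaches the ARMA(2,1) AIC of {target_aic:.2f}")
        break
```

With the placeholder white noise the loop will typically stop at a very small $p$; on the data described in the question it stopped at $p=13$.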

Now, the question: is this procedure valid? Theoretically, I should go on until infinity. Practically, I am deciding to stop based on AIC. Is there something else I need to consider?

Any help would be greatly appreciated.

– baibo
  • What were the estimated parameters of the ARMA(2,1) model? – jbowman Dec 10 '19 at 20:38
  • You might also want to look at the answer to https://stats.stackexchange.com/questions/103368/why-is-an-arma-model-a-parsimonous-approximation-of-an-ar-model?noredirect=1&lq=1 for help deciding upon the cutoff for your AR($p$) model. – jbowman Dec 10 '19 at 20:40
  • I would be very skeptical about an AR(13) model. R's forecast::auto.arima() only fits models up to ARIMA(5,2,5), and the package authors have a lot of experience to base this decision on. – Stephan Kolassa Dec 10 '19 at 20:52
  • I am skeptical of an AR(13) as well, but I am doing this because in my situation I cannot use an ARMA model (I want to use Python's arch_model, but it supports only AR mean processes, not ARMA ones). I am looking for a reference to back up what I wrote. But you raise a valid point that the default limits in R's auto-arima should be respected. – baibo Dec 10 '19 at 21:15
  • @jbowman Here you go: const = 0.0006, ar.L1.D.y = 0.5128, ar.L2.D.y = 0.1029, ma.L1.D.y = -0.9599. The first and third were insignificant at the 0.05 level; the second and fourth were significant. Also no unit roots (either AR or MA). – baibo Dec 10 '19 at 21:24
  • If you look at that MA parameter estimate, raising it to the 13th power still results in about 0.58 as the lag-13 AR parameter... although it isn't much of a contribution to $\sum_{i=1}^{\infty}\theta^i \approx 24$. This is clearly a case where the ARMA(2,1) representation is far more parsimonious than the AR($p$) representation would be. – jbowman Dec 10 '19 at 21:33 (see the sketch after these comments)
  • @jbowman I understood the $\approx 24$ part, but can you explain why raising the MA coefficient in the ARMA(2,1) to the 13th power gives the lag-13 AR parameter in the AR(13) model? – baibo Dec 10 '19 at 21:57
  • Look at the answer to https://stats.stackexchange.com/questions/103368/why-is-an-arma-model-a-parsimonous-approximation-of-an-ar-model?noredirect=1&lq=1. – jbowman Dec 10 '19 at 23:45
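To make the point about the 13th power concrete, here is a small sketch (my own illustration, not from the thread) that expands the reported ARMA(2,1) coefficients into a truncated AR$(\infty)$ representation using statsmodels' arma2ar. The heuristic of raising the MA coefficient to the 13th power tracks the geometric decay rate of the weights; the exact values also depend on the AR part of the model.

```python
import numpy as np
from statsmodels.tsa.arima_process import arma2ar

# Coefficients reported in the comments (statsmodels convention:
# y_t = c + phi1*y_{t-1} + phi2*y_{t-2} + eps_t + theta1*eps_{t-1})
phi1, phi2 = 0.5128, 0.1029
theta1 = -0.9599

# Lag-polynomial form expected by arma2ar:
# phi(L) = 1 - phi1*L - phi2*L^2,  theta(L) = 1 + theta1*L
ar_poly = np.array([1.0, -phi1, -phi2])
ma_poly = np.array([1.0, theta1])

# pi(L) = phi(L)/theta(L) gives the AR(inf) form pi(L) y_t = eps_t,
# so the AR weight at lag j in the usual "y_t = sum_j a_j y_{t-j} + eps_t"
# convention is a_j = -pi[j]
pi = arma2ar(ar_poly, ma_poly, lags=20)
ar_weights = -pi[1:]

for j, a in enumerate(ar_weights, start=1):
    print(f"lag {j:2d}: AR weight ~ {a: .4f}")

# Beyond the first couple of lags the weights shrink by roughly a factor
# |theta1| ~ 0.96 per lag, so they are still non-negligible at lag 13 --
# which is why so many AR terms are needed to match the ARMA(2,1) fit.
print("ratio of successive weights:", ar_weights[13] / ar_weights[12])
```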

0 Answers