
I am looking at this model, which is used when the residuals of a typical least squares regression are serially correlated: https://online.stat.psu.edu/stat510/lesson/8/8.1. Searching on Google, it also seems to be called regression with time series errors.

It seems like the resulting coefficients of the regression + ARIMA model are pretty much the same as those of the original regression model. Is there any reason, then, to take this extra step and rebuild the model jointly with an ARIMA model for the residuals? Since the coefficients are pretty much the same, I'd imagine prediction and inference would be the same as well.

Thanks

confused

1 Answer


A regular linear model requires independent observations. Non-independent observations mainly affect inference (standard errors, p-values, ...), not so much the coefficients. For that reason you fit an ARIMA model to the residuals: to correct the inference.
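To illustrate why inference, rather than the coefficients, is what breaks: here is a small Monte Carlo sketch in numpy (the numbers — AR(1) coefficient 0.7, slope 1.5, sample size 300 — are hypothetical). It repeatedly simulates a regression whose regressor and errors are both autocorrelated, fits plain OLS each time, and compares the actual spread of the slope estimates with the standard error the iid-error formula reports.

```python
import numpy as np

rng = np.random.default_rng(0)
n, rho, reps = 300, 0.7, 500  # hypothetical simulation settings

def ar1(n):
    """Generate an AR(1) series with unit-variance innovations."""
    out = np.zeros(n)
    shocks = rng.normal(size=n)
    for t in range(1, n):
        out[t] = rho * out[t - 1] + shocks[t]
    return out

def ols(X, y):
    """OLS coefficients plus the textbook (iid-error) standard errors."""
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    r = y - X @ b
    s2 = r @ r / (len(y) - X.shape[1])
    se = np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ X)))
    return b, se

slopes, naive_ses = [], []
for _ in range(reps):
    x, u = ar1(n), ar1(n)          # regressor and errors both autocorrelated
    y = 2.0 + 1.5 * x + u
    X = np.column_stack([np.ones(n), x])
    b, se = ols(X, y)
    slopes.append(b[1])
    naive_ses.append(se[1])

# The slope estimates are still centered on the true 1.5 (coefficients
# are fine), but their actual sampling spread exceeds what the
# iid-error standard-error formula claims.
print("mean slope:", np.mean(slopes))
print("true spread:", np.std(slopes), "vs reported SE:", np.mean(naive_ses))
```

The reported standard errors are too optimistic, so naive p-values and confidence intervals would be too narrow; modeling the error process is how you fix that.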

Edit: to expand a little on "not really on coefficients": there may be some differences in the coefficients depending on whether you include an intercept in the model and on how the ARIMA model is estimated, which is usually done by maximum likelihood or CSS (conditional sum of squares), compared to a linear model, which uses OLS. In that case, as the sample size increases, the differences should shrink.
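The closeness of the coefficients can be checked directly. The sketch below is not a full regression-with-ARIMA-errors fit; it uses a single Cochrane-Orcutt style quasi-differencing step, which handles AR(1) errors only, and the simulation numbers are hypothetical. On one simulated dataset it fits plain OLS, then refits after transforming out the estimated autocorrelation, and the two coefficient vectors come out nearly identical.

```python
import numpy as np

rng = np.random.default_rng(1)
n, rho = 500, 0.7  # hypothetical settings

# One dataset with AR(1) errors: y = 2 + 1.5 x + u, u_t = rho u_{t-1} + e_t
e = rng.normal(size=n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = rho * u[t - 1] + e[t]
x = rng.normal(size=n)
y = 2.0 + 1.5 * x + u

X = np.column_stack([np.ones(n), x])
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# Cochrane-Orcutt style step: estimate rho from the OLS residuals,
# quasi-difference both sides, and rerun least squares.
r = y - X @ b_ols
rho_hat = (r[1:] @ r[:-1]) / (r[:-1] @ r[:-1])
b_co = np.linalg.lstsq(X[1:] - rho_hat * X[:-1],
                       y[1:] - rho_hat * y[:-1], rcond=None)[0]

print(b_ols)  # plain OLS coefficients
print(b_co)   # error-model-aware coefficients: very close to OLS
```

So the coefficient estimates barely move, matching what the questioner observed; what changes is the inference built around them.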

user2974951