I have a set of monthly time series data, and I would like to fit regression models with exogenous variables (price) to a response (sales volume). The errors are surely autocorrelated, and I need a way to account for this.
I am aware of the Cochrane-Orcutt procedure, and it seems there's a lot of wisdom there.
https://online.stat.psu.edu/stat501/lesson/t/t.2/t.2.4-examples-applying-cochrane-orcutt-procedure
However, I'm concerned that this method uses only a first-order (AR(1)) quasi-differencing.
There is noise in the data, and a first-order differencing ignores information at higher-order lags. There's a reason exponential smoothing is such an effective forecasting method: it incorporates higher-order lags in a smoothed way.
There is likely a lag in the relationship between the exogenous variable and the response. If there is, say, a 1- or 2-month lag, doesn't that destroy the relationship between the first-order differenced variables?
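One standard way to handle a delayed response is to put lagged copies of the exogenous variable into the design matrix before any AR-error correction; the quasi-differencing is then applied to every column (response and all lags alike), so a genuine lagged relationship is preserved rather than destroyed. A small pandas sketch (column names illustrative):

```python
# Sketch: distributed-lag design matrix with price at lags 0, 1, and 2.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 60
price = rng.normal(10, 2, n)
df = pd.DataFrame({"price": price})

# Add lagged copies of price as extra regressors
for k in (1, 2):
    df[f"price_lag{k}"] = df["price"].shift(k)

df = df.dropna()      # the first two rows lack lagged values
print(df.shape)       # (58, 3): two rows lost to the lags
```

The resulting columns can be passed straight into a Cochrane-Orcutt-style fit alongside the contemporaneous price.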
I would appreciate any guidance on how to tackle this problem :)
Is there a more sophisticated version of Cochrane-Orcutt that includes a sort of exponential smoothing of higher-order lags? Is there a package for this? Can I hack it myself?
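On "can I hack it myself": the Cochrane-Orcutt idea generalizes directly to AR(p) errors — estimate the AR coefficients from the OLS residuals, quasi-difference both sides by the fitted AR polynomial, and refit. Here is one hand-rolled iteration for AR(2) in numpy (simulated data, illustrative names; in practice you would loop until the AR coefficients stabilize):

```python
# Sketch: one iteration of an AR(2) generalization of Cochrane-Orcutt.
# True slope is 2.0; true AR coefficients are (0.5, 0.3).
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.normal(0, 1, n)
e = np.zeros(n)
u = rng.normal(0, 0.5, n)
for t in range(2, n):
    e[t] = 0.5 * e[t - 1] + 0.3 * e[t - 2] + u[t]
y = 1.0 + 2.0 * x + e

X = np.column_stack([np.ones(n), x])

def ols(y, X):
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Step 1: OLS, then estimate AR(2) coefficients from the residuals
beta = ols(y, X)
resid = y - X @ beta
R = np.column_stack([resid[1:-1], resid[:-2]])  # lags 1 and 2
r = ols(resid[2:], R)

# Step 2: quasi-difference: z*_t = z_t - r1 * z_{t-1} - r2 * z_{t-2}
def qdiff(z):
    return z[2:] - r[0] * z[1:-1] - r[1] * z[:-2]

# Step 3: refit OLS on the transformed data
y_star = qdiff(y)
X_star = np.column_stack([qdiff(X[:, 0]), qdiff(X[:, 1])])
beta_star = ols(y_star, X_star)
print(beta_star)  # slope should be close to 2
```

In `statsmodels`, `GLSAR(y, X, rho=2).iterative_fit()` packages the same idea, so a higher-order version does exist off the shelf; what it does not give you is the exponentially weighted (MA-style) smoothing of lags you describe — for that, see the ARMA-error models below.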
Do I need to move to a time series model that can handle exogenous variables? The problem is that I actually have many time series in a sort of grouping & hierarchy, and I was hoping to fit a mixed effects model on the Cochrane-Orcutt-transformed data.
I am aware of the packages lme4, glmmTMB, and brms, which allow for autocorrelated errors in mixed effects models. Is this the sort of solution I should pursue? Unfortunately, most of these also allow only an AR(1) correlation structure for the errors.
nlme in R – it has a number of covariance patterns it can accommodate. https://stats.stackexchange.com/questions/633436/beyond-ar1-as-a-covariance-structure-for-mixed-models-with-repeated-measures – Erik Ruzek Jan 19 '24 at 18:52