I have a multiple linear regression model for time series data whose residuals are autocorrelated and display seasonal behavior. This seasonal behavior is induced deterministically by a cyclic variable included in the model. In order to calculate corrected standard errors for the regression coefficients, I intend to use generalized least squares with a correction for the autocorrelation in the residuals.
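For concreteness, here is a minimal sketch of what I have in mind for that step, using statsmodels' GLSAR (feasible GLS with AR(p) errors). The simulated data, the AR(1) order, and the variable names are placeholders, not my actual model:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Placeholder design: a trend plus a deterministic cyclic regressor.
n, period = 200, 12
t = np.arange(n)
X = sm.add_constant(np.column_stack([t, np.sin(2 * np.pi * t / period)]))
beta = np.array([1.0, 0.05, 2.0])

# AR(1) disturbances to mimic autocorrelated regression errors.
e = np.zeros(n)
eps = rng.normal(size=n)
for i in range(1, n):
    e[i] = 0.6 * e[i - 1] + eps[i]
y = X @ beta + e

# GLSAR alternates between estimating the regression coefficients and an
# AR(p) model for the residuals, yielding autocorrelation-corrected SEs.
model = sm.GLSAR(y, X, rho=1)        # rho=1 -> AR(1) error structure
results = model.iterative_fit(maxiter=10)
print(results.params)                # coefficient estimates
print(results.bse)                   # corrected standard errors
print(model.rho)                     # estimated AR coefficient
```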
But for the seasonality, I am not sure which series I should difference: the raw original data (the observations) or the regression residuals. For the data, I could superimpose the cycles and subtract the per-cycle mean, thus removing the seasonal effect, but this would alter the data and would generalize poorly to other kinds of data. For the regression residuals, I don't know whether I should difference the residual series by subtracting the value one seasonal period earlier, or treat the residuals as some kind of seasonal ARMA process.
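For the second option, something along these lines is what I would try: regression with (seasonal) ARMA errors via statsmodels' SARIMAX, where the residual process is modeled jointly with the regression rather than by differencing the raw data. The toy data and the chosen orders below are placeholders only, not a recommendation:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
n, period = 200, 12
t = np.arange(n)
X = sm.add_constant(np.column_stack([t, np.sin(2 * np.pi * t / period)]))
y = X @ np.array([1.0, 0.05, 2.0]) + rng.normal(size=n)  # placeholder data

# Regression with seasonal ARMA errors: the regression coefficients and the
# error process are estimated together, so their standard errors account for
# the autocorrelation and seasonality in the residuals.
mod = SARIMAX(y, exog=X, order=(1, 0, 0),
              seasonal_order=(1, 0, 0, period))
res = mod.fit(disp=False)
print(res.summary())
```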
Any hints?
LAST EDIT: The problem behind this question may already have been resolved, and may have been the product of a misconception.
The regression was being done on simulations of the model. These simulations did not contain any stochastic error term, so they were purely deterministic. The regression errors merely reflected an imperfect fit to the data, and the residuals obviously followed a deterministic pattern. This pattern was not the result of some neglected explanatory variable in the model, and thus there was no reason to model it (by means of an ARMA model or anything of the kind). When white noise was added to the data (which should be a crucial step in any simulation of real data), the regression residuals were mostly dominated by stochasticity and lost their autocorrelated behavior.
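A toy check along the following lines illustrates the point; the signal, the regressors, the noise level, and the Durbin-Watson diagnostic are all just illustrative, not my actual setup. With no noise, the residuals are a smooth deterministic wave and the statistic is far from 2; with white noise added, it moves close to 2:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(1)
n, period = 240, 24
t = np.arange(n)
# "True" process: trend plus a cyclic term whose shape the linear model
# cannot capture exactly (sin**3 instead of a pure sine).
signal = 0.02 * t + 2.0 * np.sin(2 * np.pi * t / period) ** 3
X = sm.add_constant(np.column_stack([t, np.sin(2 * np.pi * t / period)]))

for label, noise_sd in [("purely deterministic", 0.0), ("white noise added", 1.0)]:
    y = signal + rng.normal(scale=noise_sd, size=n)
    resid = sm.OLS(y, X).fit().resid
    # Durbin-Watson near 2 suggests little first-order autocorrelation;
    # values far below 2 indicate strong positive autocorrelation.
    print(label, durbin_watson(resid))
```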
The last paragraph answers my question. But is there a reason why I shouldn't remove the seasonality from the data and then treat the residuals as an AR process only?
– hirschme Jul 29 '16 at 10:33