To use AR/MA/ARMA models, the data needs to be stationary. Detrending the data by taking differences between sequential data points seems to achieve this. However, does decomposing the data do the same thing? By separating the data into trend, seasonal, and residual components, isn't the residual the same as the "detrended" data?
If we remove the trend from the data, why do we even run models on the remaining stationary data? If we remove the patterns, why do we expect there to still be one that a model can predict? Do we run machine learning models on this noise data?
1 Answer
There are several approaches to making a time series stationary. Differencing is one way to go, but a single difference does not always work; take, for example, a time series with a quadratic trend. You can difference repeatedly, but that can have issues, too.
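As a minimal illustration of that point (the series and its coefficients are invented for the example, assuming only numpy), first differences of a quadratic trend still contain a trend, while second differences do not:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical series with a quadratic trend plus noise (coefficients made up).
t = np.arange(100)
x = 0.05 * t**2 + rng.normal(size=100)

d1 = np.diff(x)        # first differences: a linear trend remains
d2 = np.diff(x, n=2)   # second differences: trend removed, roughly stationary noise
```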
Another approach is to find some smooth approximation of the time series and remove it from the series. What remains is a time series that can still have considerable autocorrelation. Consider, for example, a time series $x_t = y_t + z_t$ composed of a stationary series $y_t$ and a linear trend $z_t = t$. After you remove the linear trend $z_t$, the remainder is $y_t$, and running time series analysis on that remainder often makes sense (see the sketch below). A possible disadvantage is that the smooth approximation itself is estimated from the data, so different approximations can lead to different remaining time series. This problem doesn't exist with differencing, where you don't have to fit any parameters.
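A rough sketch of that idea (the AR(1) coefficient 0.7, the sample size, and the use of statsmodels' AutoReg are illustrative choices, not part of the answer itself): simulate $x_t = y_t + t$, remove a fitted straight line, and fit an AR model to the remainder.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(1)

# Simulate x_t = y_t + z_t with y_t an AR(1) process and z_t = t a linear trend.
n = 300
y = np.zeros(n)
for i in range(1, n):
    y[i] = 0.7 * y[i - 1] + rng.normal()
t = np.arange(n)
x = y + t

# Smooth approximation: here simply a straight line fitted by least squares.
slope, intercept = np.polyfit(t, x, 1)
remainder = x - (slope * t + intercept)

# The remainder is approximately y_t and still autocorrelated,
# so an AR/ARMA model on it is meaningful.
model = AutoReg(remainder, lags=1).fit()
print(model.params)  # the lag-1 coefficient should come out close to 0.7
```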