
I'm modeling the daily standard deviation of a stock market with different GARCH models. That includes a rolling forecast model of the daily standard deviation. This works pretty well so far.

To compare the rolling daily standard deviation, I have to calculate the daily standard deviation of my underlying time series (the stock market). When I use the shortcut sd(x), I only get one value. I'm looking for a time series of daily standard deviations that allows me to compare it with every day of my rolling model.

Can anybody help me to model this in R?


1 Answer


You cannot obtain a meaningful estimate of the standard deviation for a particular day if you only have one observation for that day. You do not observe any variation at all, so there is no way to tell what the standard deviation is. A GARCH model employs some restrictive assumptions (essentially, that is what the model is made of) to be able to estimate conditional standard deviations for each day. Ideally, you would like to compare that to standard deviations obtained from a model that does not use restrictive assumptions. But since you only have one observation a day, that is impossible.
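
To see the point concretely, R's sd() returns NA for a single observation, since one data point carries no information about dispersion:

    sd(0.0123)  # NA: a single daily return tells you nothing about that day's spread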

You can still evaluate the statistical adequacy of your GARCH model by looking at the standardized residuals. They should have zero autocorrelation at all nonzero lags (the Ljung-Box test may be used for that, though it becomes problematic if the conditional mean model is an ARMA model). The same applies to the squared standardized residuals (the Li-Mak test can be used for that; the ARCH-LM test is not suitable, however). Also, the probability integral transform (PIT) of the standardized residuals should be Uniform[0,1] (the Kolmogorov-Smirnov test can be used for that). A statistically adequate GARCH model must pass all three tests. You could come up with additional tests, but these three are the standard ones.
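
For illustration, here is a minimal sketch of these checks in R with the rugarch package, assuming returns is a placeholder name for your daily return series and that the innovations are normal. Base R has no Li-Mak test, so the Box.test on the squared standardized residuals below is only a rough stand-in for it:

    library(rugarch)

    # returns: placeholder for your daily return series (numeric vector or xts)
    spec <- ugarchspec(variance.model = list(model = "sGARCH", garchOrder = c(1, 1)),
                       mean.model     = list(armaOrder = c(0, 0)),
                       distribution.model = "norm")
    fit <- ugarchfit(spec, data = returns)

    z <- as.numeric(residuals(fit, standardize = TRUE))  # standardized residuals

    # 1) No autocorrelation in standardized residuals
    Box.test(z, lag = 10, type = "Ljung-Box")

    # 2) No autocorrelation in squared standardized residuals
    #    (rough stand-in for the Li-Mak test)
    Box.test(z^2, lag = 10, type = "Ljung-Box")

    # 3) PIT should be Uniform[0,1]; with normal innovations the PIT is pnorm(z)
    ks.test(pnorm(z), "punif")

The fitted conditional standard deviations themselves are available via sigma(fit); that is the series you would line up, day by day, against your rolling forecasts.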

For R code, see the vignette and the reference manual of the rugarch package. Here are some examples by the author of the package.

Richard Hardy