When building a vector autoregression model, is there some theory that would guide me in choosing the number of variables to include? For example, I have about 3000 data points and I would like to get an idea of how many lags of explanatory variables to use.
The standard practice is to use some sort of information criterion, usually the Akaike Information Criterion (AIC) or the Schwarz Information Criterion (SIC).
The AIC is defined as $2k-2\log(L)$, where $k$ is the number of parameters and $L$ is the maximized likelihood function of the model (estimated with $k$ parameters).
The SIC is defined as $k\log(n)-2\log(L)$, where $k$ and $L$ are as above, and $n$ is the number of observations of the model.
In general, one adopts the model that minimizes the chosen criterion.
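As a concrete illustration of minimizing an information criterion, here is a minimal sketch in plain NumPy for the univariate (AR) case: it simulates data from a known AR(2) process, fits AR($p$) models by OLS for several candidate lag lengths, computes the Gaussian log-likelihood, and evaluates the AIC and SIC formulas above. The simulated coefficients and the candidate lag range are illustrative choices, not anything from the answer; in practice one would use a library routine (e.g. `statsmodels.tsa.api.VAR.select_order`) for the multivariate case.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(2) process so the "true" lag order is known (illustrative).
n = 3000
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.5 * y[t - 1] - 0.3 * y[t - 2] + rng.standard_normal()

def fit_ar(y, p):
    """OLS fit of an AR(p) model; returns the maximized Gaussian
    log-likelihood and the effective sample size."""
    Y = y[p:]
    X = np.column_stack([y[p - i:-i] for i in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ beta
    m = len(Y)
    sigma2 = resid @ resid / m  # MLE of the innovation variance
    log_l = -0.5 * m * (np.log(2 * np.pi * sigma2) + 1)
    return log_l, m

results = {}
for p in range(1, 7):
    log_l, m = fit_ar(y, p)
    k = p + 1  # p AR coefficients plus the variance parameter
    aic = 2 * k - 2 * log_l          # AIC = 2k - 2 log L
    sic = k * np.log(m) - 2 * log_l  # SIC = k log n - 2 log L
    results[p] = (aic, sic)

best_aic = min(results, key=lambda p: results[p][0])
best_sic = min(results, key=lambda p: results[p][1])
print("AIC picks lag order:", best_aic)
print("SIC picks lag order:", best_sic)
```

With a sample this large and clearly non-zero coefficients, the SIC (whose $k\log(n)$ penalty grows with the sample size) tends to recover the true order, while the AIC's lighter penalty makes it somewhat more prone to selecting extra lags.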
Scortchi - Reinstate Monica
prototoast