
Consider a linear model with normally distributed, autocorrelated errors $$\begin{aligned} y&=X\beta+\varepsilon, \\ \varepsilon&\sim N(0,\sigma^2_{\varepsilon}) \text{ and autocorrelated.} \end{aligned}$$

Say, $\varepsilon\sim\text{AR(1)}$. There are several ways of estimating the model's parameters. Consider generalized method of moments (GMM) and (conditional or full) maximum likelihood (ML).

(AR(1) is chosen for simplicity and analytical tractability; ARMA(p,q) is also of interest. Normality is chosen for greater similarity and easier comparability between GMM and ML estimators.)
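To make the ML side concrete, here is a minimal sketch of the conditional ML estimator for this setup, conditioning on the first error. The simulated design, the true parameter values, and the log-variance parameterization are illustrative choices, not part of the question:

```python
import numpy as np
from scipy.optimize import minimize

def neg_cond_loglik(theta, y, X):
    # theta = (beta_0, ..., beta_{k-1}, rho, log sigma_u^2)
    k = X.shape[1]
    beta, rho, log_sig2 = theta[:k], theta[k], theta[k + 1]
    sig2 = np.exp(log_sig2)          # log-parameterization keeps sigma_u^2 > 0
    e = y - X @ beta                 # regression errors implied by beta
    u = e[1:] - rho * e[:-1]         # AR(1) innovations, conditioning on e_1
    m = len(u)
    return 0.5 * m * np.log(2 * np.pi * sig2) + 0.5 * np.sum(u ** 2) / sig2

# simulate from the model to check that the estimator recovers the parameters
rng = np.random.default_rng(42)
n, rho_true = 800, 0.5
X = np.column_stack([np.ones(n), rng.normal(size=n)])
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = rho_true * eps[t - 1] + rng.normal()
y = X @ np.array([1.0, -2.0]) + eps

res = minimize(neg_cond_loglik, np.zeros(X.shape[1] + 2), args=(y, X), method="BFGS")
beta_hat, rho_hat = res.x[:2], res.x[2]
```

Full (exact) ML would add the stationary density of the first observation to this objective; for large $n$ the two estimators behave very similarly.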

What are the main differences between these estimators?
When should we choose one over the other?

Richard Hardy
  • Good sources are Hayashi "Econometrics" pp. 406-417 for GMM and Stine's lecture notes on ML. Regarding algebraic differences: the formulas are there; one just needs to work out the messy details... Comments on practical differences and the choice between the two are very welcome. – Richard Hardy Sep 23 '19 at 19:01
  • For your GMM are you adding the lagged response as a covariate in the model? – AdamO Sep 23 '19 at 19:21
  • @AdamO, no. The model is $y=X\beta+\varepsilon$ where $\varepsilon\sim N(0,\sigma^2_{\varepsilon})$ and autocorrelated (AR(1)). The GMM point estimate is just $\hat\beta_{OLS}$ as per Hayashi; only the GMM variance estimate differs from the OLS one, adjusted to account for autocorrelation in the errors. – Richard Hardy Sep 23 '19 at 20:07
  • @AdamO, not to say that this is good or bad, just trying to stick to a familiar setup. – Richard Hardy Sep 23 '19 at 20:14
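Spelling out the point in the comments above: the GMM point estimate coincides with OLS here, and only the variance is adjusted. A sketch of the Newey-West (Bartlett-kernel) HAC sandwich variance computed by hand, compared with the naive OLS variance (the simulated data and the bandwidth `L` are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n, rho = 400, 0.6
X = np.column_stack([np.ones(n), rng.normal(size=n)])
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = rho * eps[t - 1] + rng.normal()
y = X @ np.array([0.5, 1.5]) + eps

# OLS point estimate -- also the GMM point estimate in this just-identified case
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta_hat

# Newey-West long-run variance of the moment contributions g_t = x_t * e_t
L = 8                                    # bandwidth: a tuning choice
g = X * resid[:, None]
S = g.T @ g / n
for j in range(1, L + 1):
    w = 1 - j / (L + 1)                  # Bartlett weights
    Gj = g[j:].T @ g[:-j] / n
    S += w * (Gj + Gj.T)

Q_inv = np.linalg.inv(X.T @ X / n)
V_hac = Q_inv @ S @ Q_inv / n            # sandwich covariance of beta_hat
V_ols = resid @ resid / (n - 2) * np.linalg.inv(X.T @ X)

print(np.sqrt(np.diag(V_hac)))           # HAC standard errors
print(np.sqrt(np.diag(V_ols)))           # naive OLS standard errors
```

With positively autocorrelated errors, the HAC standard error for the intercept is noticeably larger than the naive OLS one, which is the whole point of the adjustment.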

0 Answers