
I am interested in proving the delta method, where we show that

$$\sqrt{n}(g(Y_n) - g(\theta)) \overset{\text{Dist}}{\to} \text{N}(0, \sigma^2 g'(\theta)^2).$$

We use a Taylor expansion: $$g(Y_n) = g(\theta) + g'(\theta)(Y_n-\theta) + \text{Remainder}.$$ By showing that $Y_n \to \theta$ in probability, we argue that $\text{Remainder} \to 0$. However, if $Y_n \to \theta$ in probability, shouldn't the first-order term also go to zero? How are we able to set only the remainder to zero without setting the other terms to zero?
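The distinction can be seen numerically: after multiplying by $\sqrt{n}$, the first-order term does not vanish but converges in distribution. Below is a minimal simulation sketch of this (my own illustration, not from the thread), taking $Y_n$ to be the mean of $n$ i.i.d. Exponential(1) draws, so $\theta = 1$, $\sigma^2 = 1$, and with $g(x) = x^2$ the asymptotic variance is $\sigma^2 g'(\theta)^2 = 4$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 10_000, 20_000
theta, sigma2 = 1.0, 1.0            # Exponential(1): mean 1, variance 1

def g(x):
    return x ** 2                   # g'(theta) = 2*theta = 2

# Each row is one sample of size n; Y_n is the sample mean.
Y_n = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)

# The scaled first-order quantity: sqrt(n) * (g(Y_n) - g(theta)).
Z = np.sqrt(n) * (g(Y_n) - g(theta))

# Its empirical variance should be near sigma2 * g'(theta)^2 = 4,
# even though g(Y_n) - g(theta) itself shrinks to 0.
print(Z.var())
```

The unscaled difference $g(Y_n) - g(\theta)$ does go to zero; it is only the $\sqrt{n}$ blow-up that keeps the first-order term alive while the remainder, which is $O_p(1/n)$ here, is killed off.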

MoneyBall
  • Yes that is a result of the continuous mapping theorem. – AdamO Jun 03 '18 at 12:24
  • Hi, could you explain what the continuous mapping theorem is and how it was applied here? – MoneyBall Jun 03 '18 at 12:56
  • https://en.wikipedia.org/wiki/Continuous_mapping_theorem it is a direct application of exactly what you say of the 1st term. – AdamO Jun 03 '18 at 12:59
  • @AdamO - might want to expand that to an answer, since it is! – jbowman Jun 03 '18 at 14:24
  • I'm still confused on how remainder goes to 0. If $Y_n \to \theta$ then remainder which is $g(Y) - g(Y_n)$ goes to 0 but shouldn't $g'(\theta)(Y_n-\theta)$ go to zero as well? – MoneyBall Jun 04 '18 at 03:38
  • @MoneyBall that's a different question. In fact it's not a statistical question, it's a real-analysis question. You can post on math.stackexchange.com. There are rigorous delta-epsilon proofs of the convergence of Taylor-series. The precision of the approximation depends on the smoothness of the function. The remainder does not go to 0 for very wiggly (non-analytic) functions. – AdamO Jun 04 '18 at 19:29
  • Remember that the delta method is a vestige of frequentist statistics. It's not needed for Bayesian modeling, which provides exact inference for any derived quantity as a byproduct of posterior sampling. – Frank Harrell Jul 02 '19 at 11:51

1 Answer


I think that the result you are after is that in

$$\text{var}(g(Y_n)) \approx \text{var}(g(\theta) + g^\prime(\theta) (Y_n - \theta))$$

$g(\theta)$ is a constant so its variance is 0. It's not an asymptotic result, but just the definition of variance.
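As a quick numerical illustration of this point (my own sketch, with an arbitrary smooth choice $g(x) = e^x$): adding the constant $g(\theta)$ leaves the variance of the linearization unchanged, so only the $g'(\theta)(Y_n - \theta)$ term contributes.

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 1.0
# Y_n concentrated near theta, mimicking a large-n sampling distribution.
Y_n = theta + rng.normal(scale=0.01, size=100_000)

g_prime = np.exp(theta)                       # derivative of e^x at theta
lin = np.exp(theta) + g_prime * (Y_n - theta) # g(theta) + g'(theta)(Y_n - theta)

# The constant g(theta) drops out of the variance:
print(lin.var(), g_prime ** 2 * Y_n.var())
```

The two printed values agree (up to floating-point rounding), which is just the identity $\text{var}(c + aX) = a^2\,\text{var}(X)$.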

AdamO