
Let $\hat{\theta}_i$ $(i = 1, 2)$ be two unbiased estimators of $\theta \in \mathbb{R}$, which are uncorrelated, with $V(\hat{\theta}_i) = \sigma^2_i > 0$ $(i = 1, 2)$. For $\alpha \in [0, 1]$, define $\hat{\theta} = \hat{\theta}(\alpha) = \alpha\hat{\theta}_1 + (1 - \alpha)\hat{\theta}_2$.

What is the quickest way to find a value of $\alpha$ which minimizes $V[\hat{\theta}(\alpha)]$ and to show that we have an unbiased estimator?

  • All the techniques you need are exhibited in the closely related question at http://stats.stackexchange.com/questions/5392. – whuber Nov 08 '16 at 16:41
  • But how do you develop a clean proof of that? – Bonsaibubble Nov 08 '16 at 17:05
  • One way uses Lagrange Multipliers, as illustrated in a closely related situation. Another uses ordinary Calculus techniques--after all, $V(\hat\theta(\alpha))$ is a differentiable function of $\alpha$. The most elementary method observes that $$V(\hat\theta(\alpha))=(\sigma_1^2 + \sigma_2^2)\left(\alpha - \frac{\sigma_2^2}{\sigma_1^2+\sigma_2^2}\right)^2 +\frac{\sigma_1^2\sigma_2^2}{\sigma_1^2+\sigma_2^2},$$ which has an obvious unique minimum. – whuber Nov 08 '16 at 17:14
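
A quick symbolic check of the completed-square identity quoted in the comment above (a minimal sympy sketch; the symbol names are just placeholders for $\alpha$, $\sigma_1$, $\sigma_2$):

```python
# Verify that alpha^2*s1^2 + (1-alpha)^2*s2^2 equals the completed-square
# form, assuming uncorrelated unbiased estimators with variances s1^2, s2^2.
import sympy as sp

alpha, s1, s2 = sp.symbols('alpha sigma_1 sigma_2', positive=True)

var_direct = alpha**2 * s1**2 + (1 - alpha)**2 * s2**2
var_completed = (s1**2 + s2**2) * (alpha - s2**2 / (s1**2 + s2**2))**2 \
    + s1**2 * s2**2 / (s1**2 + s2**2)

# The difference simplifies to zero, confirming the identity.
print(sp.simplify(var_direct - var_completed))  # -> 0
```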

1 Answer


What I always liked about this situation is its very illuminating intuition. Given uncorrelatedness, we have

$$\text{Var}[\hat \theta (\alpha)] = \alpha^2 \sigma^2_1 + (1-\alpha)^2 \sigma^2_2.$$

Taking the derivative with respect to $\alpha$ and setting it equal to zero, we obtain the first-order condition

$$2\alpha\sigma^2_1 -2(1-\alpha) \sigma^2_2 = 0 \implies \alpha^* = \frac {\sigma^2_2}{\sigma^2_1 + \sigma^2_2}$$

while the second derivative, $2(\sigma^2_1 + \sigma^2_2)$, is strictly positive, so $\alpha^*$ is indeed the unique minimizer.

So the higher the variance of estimator no. 2, the more weight is given to the other estimator (and vice versa): each estimator is weighted inversely to its own variance.
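
As an illustration, here is a minimal Monte Carlo sketch of this inverse-variance weighting. It assumes, purely as a concrete example, that each $\hat\theta_i$ is a normal draw around $\theta$ with standard deviation $\sigma_i$; the values $\theta = 5$, $\sigma_1 = 1$, $\sigma_2 = 2$ are arbitrary choices.

```python
# Monte Carlo sketch: the variance of alpha*theta1_hat + (1-alpha)*theta2_hat
# is smallest near alpha* = sigma2^2 / (sigma1^2 + sigma2^2).
import numpy as np

rng = np.random.default_rng(0)
theta, sigma1, sigma2 = 5.0, 1.0, 2.0   # arbitrary example values
n_rep = 200_000

# Two independent (hence uncorrelated) unbiased estimators of theta.
theta1_hat = rng.normal(theta, sigma1, n_rep)
theta2_hat = rng.normal(theta, sigma2, n_rep)

alphas = np.linspace(0.0, 1.0, 101)
variances = [np.var(a * theta1_hat + (1 - a) * theta2_hat) for a in alphas]

alpha_star = sigma2**2 / (sigma1**2 + sigma2**2)
print("empirical argmin  :", alphas[np.argmin(variances)])                     # ~ 0.80
print("theoretical alpha*:", alpha_star)                                       # 0.80
print("minimal variance  :", sigma1**2 * sigma2**2 / (sigma1**2 + sigma2**2))  # 0.80
```

Note that the minimal variance $\sigma_1^2\sigma_2^2/(\sigma_1^2+\sigma_2^2)$ is smaller than both $\sigma_1^2$ and $\sigma_2^2$, so the optimal combination improves on each individual estimator.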

As regards unbiasedness, it follows immediately from the linearity of the expected value, irrespective of whether we use the optimal $\alpha^*$ or not. Proving the linearity property in turn reduces to the additivity of the integral (or sum) defining the expectation.
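
Explicitly, because the weights sum to one,

$$E[\hat\theta(\alpha)] = \alpha\, E[\hat\theta_1] + (1-\alpha)\, E[\hat\theta_2] = \alpha\theta + (1-\alpha)\theta = \theta \quad \text{for every } \alpha \in [0,1].$$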