
I could use some help as I'm working through a practice/homework problem.

Let $X_1, X_2, \dots, X_n \overset{\text{iid}}{\sim} N(\mu, 1)$. Find the UMVUE of $\theta = P(X_1 \leq c)$, where $c$ is a known constant.

  1. Let $\mathbf{X} = (X_1, X_2, \dots, X_n)$. Verify that $\sigma(\mathbf{X}) = I_{X_1 \leq c}$ is an unbiased estimator of $\theta$. Express $\theta$ in terms of the cumulative distribution function of a standard normal random variable.

  2. Let $\bar{X}_n = \frac{1}{n} \sum_{i=1}^n X_i$. Prove that $(X_i - \bar{X}_n)$ is an ancillary statistic. Then, using the results from this proof, establish that $(X_i - \bar{X}_n)$ is independent of $\bar{X}_n$.

  3. Using parts (a) and (b), derive the UMVUE of $\theta$.


My Attempt:

1: $E_{\theta}[I_{X_1 \leq c}] = P_{\theta}(X_1 \leq c) = \theta$, therefore $\sigma(\mathbf{X})$ is unbiased. Since $X_1 - \mu \sim N(0,1)$, we can write $\theta = P(X_1 - \mu \leq c - \mu) = \Phi(c - \mu)$, where $\Phi$ is the standard normal CDF.
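As a numerical sanity check (not part of the proof), simulating $X_1 \sim N(\mu, 1)$ and averaging the indicator should recover $\Phi(c - \mu)$; the values of $\mu$ and $c$ below are arbitrary choices for illustration:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
mu, c, n_sims = 0.5, 1.0, 200_000  # illustrative values

# Draw X_1 ~ N(mu, 1) repeatedly and average the indicator I(X_1 <= c)
x1 = rng.normal(mu, 1.0, size=n_sims)
empirical = np.mean(x1 <= c)

# theta = Phi(c - mu), the standard normal CDF evaluated at c - mu
theoretical = norm.cdf(c - mu)

print(empirical, theoretical)
```

The two printed numbers should agree to about two or three decimal places, confirming both unbiasedness and the $\Phi(c-\mu)$ expression.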

2: We start with $\bar{X}_n \sim N(\mu, \frac{1}{n})$, and by the properties of normal distributions (using $\operatorname{Cov}(X_i, \bar{X}_n) = \frac{1}{n}$) we can show \begin{align} X_i - \bar{X}_n \sim N\!\left(0, \tfrac{n-1}{n}\right). \end{align}

Because the distribution of $(X_i - \bar{X}_n)$ does not depend on $\mu$, it is an ancillary statistic.
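Again purely as a sanity check on the ancillarity claim (the sample size and the two values of $\mu$ below are arbitrary), simulating $X_1 - \bar{X}_n$ under different means should give the same distribution, with mean $0$ and variance $\frac{n-1}{n}$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, n_sims = 10, 100_000  # illustrative sample size and simulation count

def deviations(mu):
    # Each row is a sample X_1, ..., X_n iid N(mu, 1); return X_1 - X_bar_n
    x = rng.normal(mu, 1.0, size=(n_sims, n))
    return x[:, 0] - x.mean(axis=1)

d_a, d_b = deviations(0.0), deviations(5.0)

# Both should show mean ~ 0 and variance ~ (n - 1)/n, regardless of mu
print(d_a.mean(), d_a.var(), d_b.mean(), d_b.var(), (n - 1) / n)
```

That the summary statistics match across the two values of $\mu$ illustrates (but does not prove) that the distribution of $X_i - \bar{X}_n$ is free of $\mu$.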

3:

John

  • The notation makes little sense, but the conclusion after "can I just say" is perfectly correct. However, since you seem uncertain about that, you ought to do a bit more to convince yourself that it's right. Use definitions and first principles; there's no sophistication here, only a need to make sure you understand the notation and concepts. – whuber Oct 31 '21 at 16:20
  • Thanks, I edited 1 to reflect my thoughts, and will work to convince myself it is right. – John Oct 31 '21 at 16:26
  • Please add self-study tag. – Xi'an Oct 31 '21 at 16:37
  • Answered at https://stats.stackexchange.com/q/413264/119261 and its linked posts, though without specifically using the hints in your post. – StubbornAtom Oct 31 '21 at 16:45
