
I have this example of sufficiency:

Let $Y_1, \dots, Y_n$ be i.i.d. $N(\mu, \sigma^2)$. Note that $\sum_{i = 1}^n (y_i - \mu)^2 = \sum_{i = 1}^n (y_i - \bar{y})^2 + n(\bar{y} - \mu)^2$. Hence

$$\begin{align} L(\mu, \sigma; \mathbf{y}) &= \prod_{i = 1}^n \dfrac{1}{\sqrt{2\pi \sigma^2}}e^{-\frac{1}{2\sigma^2}(y_i - \mu)^2} \\ &= \dfrac{1}{(2\pi \sigma^2)^{n/2}}e^{-\frac{1}{2\sigma^2}\sum_{i = 1}^n (y_i - \bar{y})^2}e^{-\frac{1}{2\sigma^2}n(\bar{y} - \mu)^2} \end{align}$$

From Theorem 1, it follows that $T(\mathbf{Y}) = (\bar{Y}, \sum_{i = 1}^n (Y_i - \bar{Y})^2)$ is a sufficient statistic for $(\mu, \sigma)$.

It then says the following:

We now show that $\bar{Y} \sim N(\mu, \frac{\sigma^2}n)$.

It is clear that

$$Y_1 + \dots + Y_n \sim N(n\mu, n\sigma^2)$$

and so

$$\bar{Y} \sim N\left( \mu, \frac{\sigma^2}n \right)$$

And then the following:

We show that $\bar{Y}$ and $\sum_{i = 1}^n (Y_i - \bar{Y})^2$ are independent.

One can show that

$$\begin{align} \text{Cov}(\bar{Y}, Y_i - \bar{Y}) &= \dfrac{1}{n^2} \text{Cov} \left( \sum_{j = 1}^n Y_j, nY_i - \sum_{j = 1}^n Y_j \right) \\ &= \dfrac{1}{n^2} \left( (n - 1)\text{Var}(Y_i) - \sum_{j = 1, j \not= i}^n \text{Var}(Y_j) \right) \\ &= \dfrac{1}{n^2} ((n - 1) \sigma^2 - (n - 1)\sigma^2) \\ &= 0 \end{align}$$

Since $(\bar{Y}, Y_i - \bar{Y})$ is jointly normally distributed, the zero covariance implies that $\bar{Y}$ and $Y_i - \bar{Y}$ are independent for all $i$. So $\bar{Y}$ and the vector $(Y_1 - \bar{Y}, \dots, Y_n - \bar{Y})$ are also independent. This implies $\bar{Y}$ and $\sum_{i = 1}^n (Y_i - \bar{Y})^2$ are independent.

How is $\text{Cov}(\bar{Y}, Y_i - \bar{Y}) = \dfrac{1}{n^2} \text{Cov} \left( \sum_{j = 1}^n Y_j, nY_i - \sum_{j = 1}^n Y_j \right)$? It seems like it should be $\text{Cov}(\bar{Y}, Y_i - \bar{Y}) = \dfrac{1}{n} \text{Cov} \left( \sum_{j = 1}^n Y_j, nY_i - \sum_{j = 1}^n Y_j \right)$, no?

  • Covariance is bilinear. In your "How is" formula, the right-hand side has been multiplied once by $n$ (the first term in the covariance), multiplied again by $n$ (the second term), and then divided by $n^2$. Consequently, you are asking why $(n)(n)/(n^2) = 1$. Check it out for yourself with a simple calculation: use a dataset of two observations, for instance. – whuber Apr 07 '22 at 22:53
  • @whuber Is this the same property as $\text{Var}(aX) = a^2 \text{Var}(X)$? I suspected that it was the same property, but I couldn't find anything in my research that explicitly confirmed it. – The Pointer Apr 07 '22 at 22:55
  • 3
    Essentially the same: $\text{Cov}(aX,bY)=ab,\text{Cov}(X,Y)$ so $\text{Cov}(X/n,Y/n)=\frac1{n^2},\text{Cov}(X,Y)$ – Henry Apr 07 '22 at 23:43
  • See https://stats.stackexchange.com/a/142472/919 for explicit confirmation and this site search for more explanation of how variances and covariances are two aspects of the same thing. – whuber Apr 08 '22 at 14:32

1 Answer


\begin{align} & \operatorname{Cov}(\bar{Y}, Y_i - \bar{Y}) \\[12pt] = {} & \operatorname{Cov} \left( \frac 1 n \sum_{j = 1}^n Y_j,\,\, \frac 1 n \left( nY_i - \sum_{j = 1}^n Y_j \right) \right) \\[12pt] = {} & \frac 1 n \operatorname{Cov} \left( \sum_{j = 1}^n Y_j,\,\, \frac 1 n \left( nY_i - \sum_{j = 1}^n Y_j \right) \right) \\[12pt] = {} & \frac 1 n\cdot\frac 1 n \operatorname{Cov} \left( \sum_{j = 1}^n Y_j,\,\, nY_i - \sum_{j = 1}^n Y_j \right) \end{align}

Bilinearity pulls a factor of $\frac 1 n$ out of each argument of the covariance, and the two factors combine into the $\frac{1}{n^2}$ in front.
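Following whuber's suggestion to check this with a simple calculation, here is a quick simulation (my own sketch, not part of the original thread). The sample covariance is also bilinear, so the two sides of the identity agree exactly up to floating point, and the Monte Carlo estimate of $\text{Cov}(\bar{Y}, Y_i - \bar{Y})$ comes out near zero. The choice of $\mu = 2$, $\sigma = 3$, $n = 5$, and the replication count are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 5, 200_000
# Each row is one replication of the i.i.d. N(mu, sigma^2) sample Y_1, ..., Y_n.
Y = rng.normal(loc=2.0, scale=3.0, size=(reps, n))

ybar = Y.mean(axis=1)  # \bar{Y} for each replication
S = Y.sum(axis=1)      # \sum_j Y_j for each replication
i = 0                  # any fixed index works

def cov(a, b):
    """Sample covariance across replications."""
    return np.mean((a - a.mean()) * (b - b.mean()))

lhs = cov(ybar, Y[:, i] - ybar)          # Cov(Ybar, Y_i - Ybar)
rhs = cov(S, n * Y[:, i] - S) / n**2     # (1/n^2) Cov(sum, n*Y_i - sum)

# The two sides agree (bilinearity), and both are approximately 0.
print(lhs, rhs)
```

The first assertion below confirms the $1/n^2$ factor is the right one: scaling each argument of the covariance by $1/n$ contributes one factor of $1/n$ apiece.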