
I am attempting to show that the smallest order statistic $T = X_{(1)}$ is minimally sufficient for the mean of a distribution when the variance is known. In particular, iid random variables $X_1,\ldots,X_n$ have a pdf given by $f(x;\theta)=\frac{1}{\sigma}e^{-(x-\theta)/\sigma}$ for $x \ge \theta$, with $\sigma$ known.

I know that the two necessary conditions for minimal sufficiency are: i.) sufficiency; and ii.) that the statistic cannot be further reduced to another sufficient statistic. After showing that $T$ is sufficient, I was unable to prove the second condition.

Since the statistic is clearly minimally sufficient (I wouldn't have been told to prove it otherwise), it must be true that $T$ cannot be reduced to another, arbitrarily defined sufficient statistic $T'$. In other words, $T$ must be a function of any such $T'$. However, if $T' = \sum_i X_i$, for example, i.e. $T'$ is sufficient, then $T$ would be a function of $T'$ only if $n=1$. Hence the second condition of minimal sufficiency would not hold. I must be missing something...
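For reference, the sufficiency of $T = X_{(1)}$ (step i.) can be seen from the factorization theorem once the support of the density is made explicit:

$$
L(\theta;x)=\prod_{i=1}^{n}\frac{1}{\sigma}e^{-(x_i-\theta)/\sigma}\,\mathbf{1}\{x_i\ge\theta\}
=\underbrace{e^{n\theta/\sigma}\,\mathbf{1}\{x_{(1)}\ge\theta\}}_{g(T(x),\,\theta)}\cdot\underbrace{\sigma^{-n}e^{-\sum_i x_i/\sigma}}_{h(x)},
$$

so with $\sigma$ known, the joint density depends on $\theta$ only through $x_{(1)}$.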

asked by David; edited by Taylor
    Given that the dimension of your proposed minimally sufficient statistic is (fill in the blank) ___, can it be REDUCED to another sufficient statistic, i.e., to one with smaller dimension? – jbowman May 28 '18 at 00:26
  • Clearly I misunderstood the definition of reduction. By your logic, it then follows that every one-dimensional sufficient statistic is minimally sufficient, correct? – David May 28 '18 at 19:04
  • That's correct. It's all about the dimensionality of the sufficient statistic; once you've found a minimal sufficient statistic, any 1-1 transform of it is also a minimal sufficient statistic. – jbowman May 28 '18 at 19:18
  • @jbowman: I don't think that is true! (while it might be a good initial guess). An example: $X \sim \mathcal{U}(-\theta, \theta)$. Then clearly $X$ is sufficient for $\theta$, but so is $|X|$, so $X$ is not minimally sufficient. You could construct similar examples based on the $\mathcal{N}(0,\sigma^2)$ family. – kjetil b halvorsen Jul 12 '18 at 01:24
  • @kjetilbhalvorsen So, you're saying that $|X|$ is a reduction of $X$ because there are two $X$'s for every $|X|$? – David Jul 12 '18 at 01:42
  • Yes, I say so. So to prove minimal sufficiency more is needed than just pointing out the statistic is one-dim! – kjetil b halvorsen Jul 12 '18 at 01:43
  • By that logic, wouldn't $X^2$ also be a reduction? I don't think it is but could be wrong. – David Jul 12 '18 at 01:48
  • @kjetilbhalvorsen - you are quite right, and I should have remembered that there's more to "minimal" than "dimension". Just lazy thinking on my part, I suppose. – jbowman Jul 12 '18 at 20:55
  • @DavidS: Yes, $X^2$ would also be a reduction. After all, there is a one-one relation between $X^2$ and $|X|$. – kjetil b halvorsen Jul 12 '18 at 21:16

1 Answer


Some hints: the question about sufficiency has been asked and answered before, for instance at UMVUE of two-parameter exponential family distribution. For minimal sufficiency, use the same method as in Finding a minimal sufficient statistic and proving that it is incomplete: look at the ratio of the likelihood functions for two different samples, and show that it is free of the parameter if and only if the two samples have the same value of the statistic $T$.

The following post explains why this works: Understanding a characterization of minimal sufficient statistics
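As a quick numerical illustration of this criterion (a sketch; the function names and sample values are mine, with $\sigma = 1$): the log-likelihood ratio of two samples is constant in $\theta$ when the samples share the same minimum, but depends on $\theta$ when the minima differ, because the support indicator $\mathbf{1}\{x_{(1)}\ge\theta\}$ kicks in for one sample before the other.

```python
import numpy as np

def log_likelihood(theta, x, sigma=1.0):
    """Log-likelihood for iid draws from f(x; theta) = (1/sigma) exp(-(x - theta)/sigma), x >= theta."""
    x = np.asarray(x, dtype=float)
    if x.min() < theta:
        return -np.inf  # sample falls outside the support
    return -len(x) * np.log(sigma) - (x - theta).sum() / sigma

def log_ratio(theta, x, y, sigma=1.0):
    """Log of the likelihood ratio L(theta; x) / L(theta; y)."""
    return log_likelihood(theta, x, sigma) - log_likelihood(theta, y, sigma)

# Samples with the SAME minimum: the log-ratio is the same for every
# theta at which both likelihoods are positive (free of the parameter).
x = [0.5, 1.2, 3.0]
y = [0.5, 2.0, 2.7]
ratios_same_min = [log_ratio(t, x, y) for t in (0.0, 0.2, 0.4)]

# Samples with DIFFERENT minima: for theta between the two minima one
# likelihood vanishes, so the log-ratio depends on theta.
z = [0.8, 1.2, 3.0]
ratios_diff_min = [log_ratio(t, x, z) for t in (0.0, 0.6)]
```

Here `ratios_same_min` contains one repeated value, while `ratios_diff_min` changes (it is finite at $\theta = 0$ but $-\infty$ at $\theta = 0.6$, where $x_{(1)} < \theta < z_{(1)}$), matching the characterization above.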