
I have some trouble understanding complete sufficient statistics.

Let $T=\Sigma x_i$ be a sufficient statistic.

A statistic $T$ is complete if, for every function $g$, $E_\theta[g(T)]=0$ for all $\theta$ implies $g(T)=0$ with probability 1.

But what does this mean? I've seen the uniform and Bernoulli examples (page 6 of http://amath.colorado.edu/courses/4520/2011fall/HandOuts/umvue.pdf), but they aren't intuitive, and the integration only confused me more.

Could someone explain this in a simple and intuitive way?

user13985

1 Answer


Essentially, it means that no non-trivial function of the statistic has a mean value that stays constant as $\theta$ varies.
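
To see the mechanics in the simplest case (this is a sketch of the Bernoulli example from the notes linked in the question): for $X_1,\dots,X_n$ i.i.d. Bernoulli($\theta$), $T=\sum_i X_i \sim \mathrm{Binomial}(n,\theta)$, so

$$E_\theta[g(T)] = \sum_{t=0}^{n} g(t)\binom{n}{t}\theta^t(1-\theta)^{n-t}.$$

If this equals $0$ for every $\theta\in(0,1)$, dividing by $(1-\theta)^n$ gives a polynomial in $\rho=\theta/(1-\theta)$ that vanishes identically, so every coefficient $g(t)\binom{n}{t}$ must be $0$, hence $g\equiv 0$ on $\{0,\dots,n\}$. The only function of $T$ with mean zero for all $\theta$ is the trivial one, so $T$ is complete.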

This may not be very enlightening in itself. Perhaps one way of looking at the utility of such a notion is its connection with the Lehmann–Scheffé theorem (Cox & Hinkley, Theoretical Statistics, p. 31): "In general, if a sufficient statistic is boundedly complete it is minimal sufficient. The converse is false."
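
There is also a one-line consequence worth noting (standard material, and the theme of the linked UMVUE notes): if $h_1(T)$ and $h_2(T)$ are both unbiased for $\tau(\theta)$, then

$$E_\theta[h_1(T)-h_2(T)] = \tau(\theta)-\tau(\theta) = 0 \quad \text{for all } \theta,$$

so completeness forces $h_1(T)=h_2(T)$ almost surely. There is at most one unbiased estimator based on $T$, which is why complete sufficient statistics yield UMVUEs.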

Intuitively, if a function of $T$ has a mean value that does not depend on $\theta$, that mean value is not informative about $\theta$, and we could discard it to obtain a "simpler" sufficient statistic. If $T$ is boundedly complete and sufficient, no such "simplification" is possible.
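
A small example of such a "simplification" (my illustration, not from Cox & Hinkley): for $X_1, X_2$ i.i.d. $N(\theta,1)$, the pair $T=(X_1,X_2)$ is sufficient but not complete, since $g(T)=X_1-X_2$ satisfies

$$E_\theta[X_1-X_2]=\theta-\theta=0 \quad\text{for all }\theta$$

without being identically zero. The difference $X_1-X_2$ is ancillary noise carrying no information about $\theta$, and throwing it away leaves the "simpler" sufficient statistic $X_1+X_2$, which is complete.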

F. Tusell