
In the textbook All of Statistics: A Concise Course in Statistical Inference by Larry Wasserman, the definition of minimal sufficient is given as follows:

9.35 Definition. A statistic $T$ is minimal sufficient if (i) it is sufficient; and (ii) it is a function of every other sufficient statistic.

It then gives the following theorem:

9.36 Theorem. $T$ is minimal sufficient if the following is true: $$T(x^n) = T(y^n) \ \text{if and only if} \ x^n \leftrightarrow y^n.$$

Why does this theorem use these exponents of $n$? The notation $x^n$ and $y^n$ doesn't really make sense to me.

The Pointer

1 Answer


In that book, the $n$ random variables $X_1, \dots, X_n$ constituting the data set are represented as $X^n$, whereas realisations $x_1, x_2, \dots, x_n$ of those random variables are represented as $x^n$.

There is no exponentiation occurring.
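As a sanity check of how Theorem 9.36 reads in this notation, here is a small Python sketch (my own illustration, not from the book) for a Bernoulli$(\theta)$ model, where $T(x^n) = \sum_i x_i$ is the usual minimal sufficient statistic. The book's relation $x^n \leftrightarrow y^n$ means the likelihood ratio $\mathcal{L}(x^n; \theta) / \mathcal{L}(y^n; \theta)$ does not depend on $\theta$; the sketch checks numerically, over all binary data sets of length 3, that $T(x^n) = T(y^n)$ holds exactly when that ratio is constant across several values of $\theta$:

```python
import itertools
import math

def bernoulli_likelihood(xn, theta):
    # L(theta; x^n) = prod_i theta^{x_i} (1 - theta)^{1 - x_i}
    return math.prod(theta**x * (1 - theta)**(1 - x) for x in xn)

def equivalent(xn, yn, thetas=(0.2, 0.5, 0.8)):
    # x^n <-> y^n: the likelihood ratio is the same for every theta tried
    ratios = [bernoulli_likelihood(xn, t) / bernoulli_likelihood(yn, t)
              for t in thetas]
    return all(abs(r - ratios[0]) < 1e-9 for r in ratios)

T = sum  # candidate minimal sufficient statistic for the Bernoulli model

n = 3
for xn in itertools.product([0, 1], repeat=n):
    for yn in itertools.product([0, 1], repeat=n):
        # Theorem 9.36: T(x^n) = T(y^n) iff x^n <-> y^n
        assert (T(xn) == T(yn)) == equivalent(xn, yn)
```

Checking the ratio at a finite grid of $\theta$ values is of course only a numerical stand-in for "does not depend on $\theta$", but for the Bernoulli model the ratio is $\theta^{s_x - s_y}(1-\theta)^{s_y - s_x}$ with $s_x = \sum_i x_i$, which is constant precisely when $s_x = s_y$.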

microhaus
  • Ahh, that makes sense. I'm jumping through the textbook to parts of interest, rather than reading from the beginning, so I missed that. – The Pointer Apr 20 '21 at 01:05
  • 1
That book can be quite terse, but it is appropriate if you require stats knowledge only as a prerequisite for machine learning work. Wasserman has an accompanying publicly available YouTube course here: https://www.youtube.com/playlist?list=PLJPW8OTey_OZk6K_9QLpguoPg_Ip3GkW_

    And lecture notes on his CMU course page.

    – microhaus Apr 20 '21 at 01:12