
I have that $X=(X_1,\dots,X_n)$ is a random vector consisting of $n$ iid exponential random variables with rate parameter $\theta$ (and thus mean $\frac{1}{\theta}$).

I have to prove that $T(X)=\sum^n_{i=1}X_i$ is a sufficient statistic for $\theta$ using the theorem in my textbook which states that if $\frac{f_{\theta}(x)}{f^T_{\theta}(T(x))}$ is constant in $\theta$ then $T(X)$ is a sufficient statistic for ${\theta}$.

So what I've done is:

First I calculated $f_{\theta}(x)= \theta^n e^{-\theta\sum^n_{i=1}X_i}$ since $x=(x_1,...,x_n)$ are iid.
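(Aside: the product form above can be sanity-checked numerically; this is a sketch with made-up sample values, not part of the original post.)

```python
# Sanity check (not in the original post): the product of n iid Exp(theta)
# densities, theta * exp(-theta * x_i), collapses to theta^n * exp(-theta * sum x_i).
import math

theta = 2.0            # hypothetical rate parameter
x = [0.3, 1.2, 0.8]    # hypothetical sample values
n = len(x)

# Product of the individual densities
prod_pdf = math.prod(theta * math.exp(-theta * xi) for xi in x)

# Closed form used in the question
closed_form = theta**n * math.exp(-theta * sum(x))

assert math.isclose(prod_pdf, closed_form)
```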

Then I calculated $f_{\theta}^T(T(x))= \frac{\left(\sum^n_{i=1}X_i\right)^{n-1}e^{-\frac{\sum^n_{i=1}X_i}{\theta}}}{\theta^n(n-1)!}$ since $T(X)\sim \Gamma(n,\theta)$.

So now using the theorem with the ratio I get stuck because the $\theta$'s don't cancel out:

$$\frac{f_{\theta}(x)}{f^T_{\theta}(T(x))}=\frac{\theta^n e^{-\theta\sum^n_{i=1}X_i}}{\frac{\left(\sum^n_{i=1}X_i\right)^{n-1}e^{-\frac{\sum^n_{i=1}X_i}{\theta}}}{\theta^n(n-1)!}}=\frac{\theta^{2n}(n-1)!\,e^{\frac{1-\theta^2}{\theta}\sum^n_{i=1}x_i}}{\left(\sum^n_{i=1}x_i\right)^{n-1}}$$

So by my calculations (which are probably wrong), $\theta$ does not cancel out. Does anyone see any mistakes I have made? Any help would be appreciated!

Here's the theorem in my textbook: (stated as an image, not reproduced here)

  • What is your definition of $f_{\theta}^T(T(x))$? Anyway, the natural way to use the Factorisation Theorem is to show that the joint probability density of the sample is the product of a function $g$ of the statistic and the parameter $\theta$, and a function $h$ that does not depend on $\theta$, as is done without difficulty at the bottom of this page: https://online.stat.psu.edu/stat414/node/283/. – Mickybo Yakari Jan 20 '20 at 11:43
  • No, but I would not like to prove it using the Fisher–Neyman factorisation theorem, but rather this way, which is in the question. –  Jan 20 '20 at 12:02
  • Ok. How do you define $f_{\theta}^{T}(T(x))$? I am puzzled by the superscript. – Mickybo Yakari Jan 20 '20 at 12:04
  • @MickyboYakari it is the pdf of $T(X)=\sum^n_{i=1}X_i \sim \Gamma(n,\theta)$ –  Jan 20 '20 at 12:08
  • All right. I'll write the answer then. – Mickybo Yakari Jan 20 '20 at 12:18
  • Cross-posted at https://math.stackexchange.com/q/3515790/321264. – StubbornAtom Jan 20 '20 at 13:11

1 Answer


Because the individuals are sampled independently from an exponential distribution with rate parameter $\theta$, we have $$ T(\boldsymbol{X})=\sum_{i=1}^{n} X_i \sim \Gamma(\alpha=n,\beta=\theta).$$ So, $$f_\theta^{T}(T(\boldsymbol{X}))=\theta^n \frac{1}{(n-1)!} \left(\sum_{i=1}^{n} X_i\right)^{n-1} e^{-\theta \sum_{i=1}^{n} X_i },$$ and $$ \frac{f_{\theta}(x)}{f_\theta^{T}(T(\boldsymbol{X}))} =\frac{ (n-1)!}{\left(\sum_{i=1}^{n} X_i\right)^{n-1}},$$ which is constant with respect to $\theta$.
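This cancellation can also be checked symbolically; a minimal sketch with sympy (not part of the original answer), where `s` stands in for the observed value of $T(x)=\sum x_i$:

```python
# Symbolic check (a sketch, not part of the original answer) that the ratio
# f_theta(x) / f_theta^T(T(x)) is free of theta once the rate parametrisation
# Gamma(alpha=n, beta=theta) is used for T(X).
import sympy as sp

n = sp.Symbol('n', integer=True, positive=True)
theta = sp.Symbol('theta', positive=True)
s = sp.Symbol('s', positive=True)   # s plays the role of T(x) = sum of the x_i

# Joint density of the sample, written in terms of s
f_joint = theta**n * sp.exp(-theta * s)

# Gamma(n, rate=theta) density of T(X)
f_T = theta**n * s**(n - 1) * sp.exp(-theta * s) / sp.factorial(n - 1)

ratio = sp.simplify(f_joint / f_T)

assert sp.diff(ratio, theta) == 0   # the ratio is constant in theta
assert sp.simplify(ratio - sp.factorial(n - 1) / s**(n - 1)) == 0
```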

  • Oh, so my only mistake was taking $\beta$ for $\frac{1}{\theta}$. –  Jan 20 '20 at 13:15
  • One should always be wary of distributions which have more than one classical parametrisation. This is the reason I like the $\Gamma(\alpha=\alpha_0,\beta=\beta_0)$ notation. – Mickybo Yakari Jan 20 '20 at 13:22
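The rate-vs-scale pitfall discussed in the comments can be made concrete with SciPy, whose gamma distribution is parametrised by a shape and a *scale* parameter, so the rate parametrisation $\Gamma(\alpha=n,\beta=\theta)$ requires `scale=1/theta` (a sketch with made-up values, not from the original thread):

```python
# Illustrates the rate-vs-scale pitfall: SciPy's gamma takes shape and scale,
# so Gamma(alpha=n, beta=theta) in the rate parametrisation needs scale = 1/theta.
import math
from scipy.stats import gamma

n, theta, s = 3, 2.0, 1.5   # hypothetical shape, rate, and evaluation point

# Rate parametrisation Gamma(alpha=n, beta=theta), written out by hand
pdf_by_hand = theta**n * s**(n - 1) * math.exp(-theta * s) / math.factorial(n - 1)

# SciPy's gamma uses a scale parameter, so a rate of theta means scale = 1/theta
pdf_scipy = gamma.pdf(s, a=n, scale=1/theta)

assert math.isclose(pdf_by_hand, pdf_scipy)
```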