
Warning

This question is the second part of a three-part series. The third and last part is found here.

Exercise

Let $X \sim Pa(\lambda,\theta)$ with density function: $ f(x; \theta, \lambda) = \frac{\lambda \theta^{\lambda}}{x^{\lambda+1}} $ where $x \geq \theta$, $\theta > 0$, and $\lambda > 0$.

From the first question I have found the CDF: $ F(x) = 1 - \left( \frac{\theta}{x} \right)^{\lambda} $
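
As a quick sanity check of this CDF (not required by the exercise), it can be inverted to simulate draws from the distribution. A minimal sketch assuming NumPy; `sample_pareto` and the parameter values are illustrative, not part of the original problem:

```python
import numpy as np

def sample_pareto(n, lam, theta, seed=None):
    """Draw n samples from Pa(lam, theta) by inverting the CDF:
    u = F(x) = 1 - (theta/x)^lam  =>  x = theta * (1 - u)**(-1/lam)."""
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=n)
    return theta * (1.0 - u) ** (-1.0 / lam)

# Empirical CDF at x = 3 vs. the closed form: both should be ~ 19/27 = 0.7037.
x = sample_pareto(100_000, lam=3.0, theta=2.0, seed=0)
print((x <= 3.0).mean(), 1 - (2.0 / 3.0) ** 3.0)
```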

Let $\lambda$ be known. Find the maximum likelihood estimator of $\theta$ and establish whether it is unbiased.

Try

I write the likelihood and the log-likelihood function: $$ L(\theta) \propto \frac{\lambda^n\theta^{n\lambda}}{\prod_{i=1}^{n} x^{\lambda+1}_i} = \lambda^n\theta^{n\lambda} \left( \prod_{i=1}^{n} x_i \right)^{-(\lambda+1)} $$

$$ l(\theta) = \log L(\theta) = n\log(\lambda) + n\lambda \log(\theta) - (\lambda+1)\sum^{n}_{i=1} \log(x_i) $$

Since $\lambda$ is known, then $$ L(\theta) \propto \theta^{n\lambda}\, 1(x_{(1)} \geq \theta) $$

where $1(\cdot)$ is the indicator function. For $\lambda > 0$, the likelihood function is monotone increasing in $\theta$ on the interval $(0, x_{(1)}]$ and zero beyond it. Hence the MLE of $\theta$ is the sample minimum, $\hat{\theta} = x_{(1)}$.
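
To see the boundary maximum concretely, here is a small numerical sketch (assuming NumPy; `log_likelihood` is a hypothetical helper written for illustration). It evaluates $l(\theta)$ on a grid, including the support constraint, and the maximizer lands at $x_{(1)}$:

```python
import numpy as np

def log_likelihood(theta, x, lam):
    """l(theta) for Pa(lam, theta); -inf once theta > min(x), because the
    support constraint x_i >= theta is then violated."""
    if theta > x.min():
        return -np.inf
    n = len(x)
    return n * np.log(lam) + n * lam * np.log(theta) - (lam + 1) * np.log(x).sum()

rng = np.random.default_rng(0)
lam, theta_true = 3.0, 2.0
x = theta_true * (1 - rng.uniform(size=50)) ** (-1.0 / lam)  # Pa(3, 2) sample
grid = np.linspace(0.5, x.min() + 0.5, 2001)
ll = [log_likelihood(t, x, lam) for t in grid]
print(grid[int(np.argmax(ll))], x.min())  # grid maximizer sits at x_(1)
```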

I verify whether it is unbiased. Since $P(X_{(1)} > x) = [P(X > x)]^n = (\theta/x)^{n\lambda}$, the sample minimum $X_{(1)}$ is itself $Pa(n\lambda, \theta)$, with density $f_{X_{(1)}}(x) = \frac{n\lambda \theta^{n\lambda}}{x^{n\lambda+1}}$ for $x \geq \theta$. Then:

$$ E(\hat{\theta}) = \int^{+\infty}_{\theta} x\, f_{X_{(1)}}(x)\, dx = \int^{+\infty}_{\theta} x\, \frac{n\lambda \theta^{n\lambda}}{x^{n\lambda+1}}\, dx = n\lambda \theta^{n\lambda} \int^{+\infty}_{\theta} x^{-n\lambda}\, dx $$

$$ = n\lambda \theta^{n\lambda} \left[ \frac{x^{-(n\lambda-1)} }{-(n\lambda-1)} \right]^{+\infty}_{\theta} = n\lambda \theta^{n\lambda} \left(0 + \frac{\theta^{-(n\lambda-1)} }{n\lambda-1} \right) = n\lambda \theta^{n\lambda} \frac{\theta^{-n\lambda}\theta}{n\lambda-1}, $$ where the integral converges provided $n\lambda > 1$.

Finally: $$ E(\hat{\theta}) = \theta \left( \frac{n\lambda }{n\lambda-1} \right) $$

It's biased because of the factor $n\lambda/(n\lambda-1) > 1$, and the bias is necessarily upward: since $P(X_{(1)} < \theta) = 0$, the sample minimum can never fall below $\theta$, so $\hat{\theta}$ overestimates $\theta$ on average.
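
A Monte Carlo check of this bias factor; a sketch assuming NumPy, with arbitrary values $\lambda = 3$, $\theta = 2$, $n = 10$:

```python
import numpy as np

# Check E[theta_hat] = theta * n*lam / (n*lam - 1) by simulation.
rng = np.random.default_rng(1)
lam, theta, n, reps = 3.0, 2.0, 10, 200_000
u = rng.uniform(size=(reps, n))
samples = theta * (1 - u) ** (-1.0 / lam)  # each row: an i.i.d. Pa(3, 2) sample
theta_hat = samples.min(axis=1)            # MLE = sample minimum, row by row
print(theta_hat.mean())                    # simulated mean of the MLE
print(theta * n * lam / (n * lam - 1))     # theoretical value: 60/29 ~ 2.069
```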

For the calculation of the MLE and the verification of unbiasedness, I followed this question.

If I have missed anything else, please let me know.

  • Your logic is hard to follow, because your equation "$\frac{n\lambda}{\theta}=0$" never has a solution. – whuber Jan 24 '23 at 17:48
  • @whuber I know about the equation $n\lambda/\theta = 0$, but I was following the MLE procedure. Should I fix the logic of the question? – iStats7238 Jan 24 '23 at 17:51
  • That's not the MLE procedure: it reflects a common misunderstanding of a Calculus technique to find an optimum. The procedure is, literally, to find the optimum. This includes more than examining the zeros of the derivative of the objective function. You also have to examine places where that function might fail to be differentiable, as well as evaluating it on the boundary of its domain. The point of this exercise is to help you appreciate the distinction and see an example where the optimum is attained on the boundary. – whuber Jan 24 '23 at 19:00
  • @whuber Hi. I have tried to adjust the logic of the procedure. Do you think it makes sense now? – iStats7238 Jan 31 '23 at 15:05
  • It makes sense. Notice that you don't even need to compute derivatives: you only need to justify your observation that the likelihood is decreasing. – whuber Jan 31 '23 at 16:25
  • @whuber Ah okay. I thought I had to calculate the derivatives. I will arrange to remove them. Regarding the decreasing of the likelihood: should I better specify the role of the denominator in the function $L(\theta)$? – iStats7238 Jan 31 '23 at 17:20
  • No, the $\propto$ takes care of that. (At least if I was grading it, it would.) – jbowman Jan 31 '23 at 17:43
  • @jbowman Perfect, I have solved the question then. Thank you both. – iStats7238 Jan 31 '23 at 17:54
