
Here is my understanding of these topics:

  • MLE is better when you know the type of distribution that generated the data (ex: the formula for the MLE of the mean of a normal distribution is different from that of a binomial distribution, which is different from that of a negative binomial distribution, and all the variance formulas differ as well).

  • Moments/OLS is better when you do not know which distribution generated the data (ex: if you collect data regardless of which distribution it came from, you can take the basic sample mean, and this formula never changes: $\mu = \frac{1}{n} \sum_{i=1}^{n} x_i$). Moment estimators are less sensitive to a wrong distributional choice and to outliers.
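To make the efficiency side of this trade-off concrete, here is a small Monte-Carlo sketch of my own (not from any reference): for Laplace-distributed data, the MLE of the location parameter is the sample median, while the method-of-moments estimator of the center is the sample mean. When the Laplace assumption is correct, the MLE is noticeably more efficient:

```python
import numpy as np

rng = np.random.default_rng(0)
n_reps, n = 2000, 200

# Data truly Laplace(loc=0, scale=1). For this model the MLE of the
# location parameter is the sample MEDIAN; the method-of-moments
# estimator of the center is the sample MEAN.
samples = rng.laplace(loc=0.0, scale=1.0, size=(n_reps, n))
mle_est = np.median(samples, axis=1)   # MLE under the (correct) Laplace model
mom_est = np.mean(samples, axis=1)     # method-of-moments estimator

print("var(MLE / median):", mle_est.var())
print("var(MoM / mean):  ", mom_est.var())
# Asymptotically var(median) = b^2/n while var(mean) = 2*b^2/n for
# Laplace(mu, b), so the MLE has roughly half the variance here.
```

This is exactly the "minimum variance when the distribution class is correct" advantage: both estimators are consistent, but the MLE extracts more information per observation when the model is right.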

I am confused about how to objectively compare the advantages of MLE vs. Moments/OLS. The MLE is consistent, asymptotically normal for large sample sizes, has minimum variance when the distribution class is correct, and is sometimes unbiased ... but I think the same holds for Moments/OLS (ex: OLS is BLUE, the best linear unbiased estimator).

This is very confusing because everything seems similar and contradictory. I am looking for a rule to guide my analysis, ex: when you know the distribution, use MLE because of reasons x, y, z ... and when you don't know the distribution, use OLS/Moments because of reasons a, b, c.

What exactly are the advantages? How do I calculate the risk-payoff? Ex: in a situation where I am not confident about the distribution class ... the possible gain in some performance indicator from using MLE over OLS/Moments = a and the possible loss = b; the possible gain in the same indicator from using OLS/Moments over MLE = a^-1 and the possible loss = b^-1.

For example, performance indicators could be: smaller bias, smaller variance, a smaller sample size needed for the same performance, etc.
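One way I can imagine making this risk-payoff concrete (my own sketch, using Monte-Carlo mean squared error as the performance indicator) is to simulate both the "assumed model correct" and "assumed model wrong" scenarios and compare the MSE of each estimator in each case. Here the assumed model is Laplace, whose location MLE is the sample median, and the moments estimator is the sample mean:

```python
import numpy as np

rng = np.random.default_rng(1)
n_reps, n, true_mu = 2000, 200, 0.0

def mse(estimates, truth):
    """Monte-Carlo mean squared error = variance + bias^2."""
    return np.mean((estimates - truth) ** 2)

results = {}
scenarios = [
    ("model correct (Laplace data)", lambda: rng.laplace(true_mu, 1.0, (n_reps, n))),
    ("model wrong (normal data)",    lambda: rng.normal(true_mu, 1.0, (n_reps, n))),
]
for label, draw in scenarios:
    x = draw()
    # "MLE" always means: the MLE under the ASSUMED Laplace model (the median),
    # whether or not that assumption actually matches the data.
    results[label] = (mse(np.median(x, axis=1), true_mu),   # assumed-model MLE
                      mse(np.mean(x, axis=1), true_mu))     # method of moments
    print(f"{label}: MSE(MLE) = {results[label][0]:.5f}, "
          f"MSE(MoM) = {results[label][1]:.5f}")
# When the Laplace assumption holds, the MLE wins (smaller MSE); when the
# data is actually normal, the misspecified "MLE" loses to the plain mean.
```

The ratio of the two MSEs in each scenario is one numeric version of the gain a and loss b asked about above: it quantifies how much you win by trusting the distributional assumption when it is right, versus how much you pay when it is wrong.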

Can this be done?

stats_noob
