
From what I understand, maximum likelihood is used to estimate a parameter alpha in a way that maximizes a probability such as P(Y = y | x, alpha). It is used for logistic regression in order to get better estimates. I'm using Weka, and I read that, unless the user makes an explicit choice, the maximum likelihood algorithm is used.
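To make my understanding concrete, here is a rough sketch of what I think "maximum likelihood" does when fitting a logistic regression. This is Python with made-up toy data, not Weka's actual (Java) internals; it only illustrates the idea of picking alpha to maximize the likelihood of the observed data:

```python
import numpy as np
from scipy.optimize import minimize

# Toy data (made up for illustration): one feature x, binary label y.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = (x + rng.normal(scale=0.5, size=100) > 0).astype(float)

X = np.column_stack([np.ones_like(x), x])  # add an intercept column

def neg_log_likelihood(alpha):
    """Negative log-likelihood of the logistic model P(Y=1 | x, alpha)."""
    p = 1.0 / (1.0 + np.exp(-X @ alpha))   # sigmoid gives P(Y=1 | x, alpha)
    eps = 1e-12                            # guard against log(0)
    return -np.sum(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

# Maximum likelihood: choose the alpha that minimizes the negative
# log-likelihood (i.e. maximizes the likelihood of the observed data).
result = minimize(neg_log_likelihood, x0=np.zeros(2), method="BFGS")
print("ML estimate of alpha:", result.x)
```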

Further reading led me to the subject of goodness of fit, and more precisely the Hosmer–Lemeshow test, which was defined as a method to estimate "the best fit value of unknown parameters".

So here are my questions: are maximum likelihood and goodness-of-fit tests meant for the same thing? If not, what is the difference between the two? And if they are meant for the same purpose, can maximum likelihood be considered a subfamily of the goodness-of-fit tests?

  • Goodness of fit tests are tests to examine the fit of an already fitted model, not a method for parameter estimation. – gammer Apr 27 '17 at 13:06
  • @gammer So goodness-of-fit tests are used after training + testing the model? Because you said "to examine the fit of an already fitted model", and an already fitted model is a model that's trained and tested, right? – engineering student Apr 27 '17 at 13:11
  • Yes, that is correct. – Matthew Drury Apr 27 '17 at 14:24
  • You should also note that the Hosmer–Lemeshow test is now considered obsolete: https://stats.stackexchange.com/questions/35422/validation-of-logistic-regression-goodness-of-fit-pearson/35427#35427 – kjetil b halvorsen Mar 29 '18 at 14:01
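To illustrate the distinction raised in the comments: a goodness-of-fit test such as Hosmer–Lemeshow takes an already fitted model and asks whether its predictions agree with the observed outcomes; it does not estimate parameters itself. Below is a minimal sketch of that idea in Python, with made-up predicted probabilities standing in for the output of a model that was already fitted (by maximum likelihood or otherwise); the 10 groups and the chi-square reference with g − 2 degrees of freedom follow the usual description of the test:

```python
import numpy as np
from scipy.stats import chi2

# Toy setup (made up): predicted probabilities from some already-fitted
# logistic model, plus the observed 0/1 outcomes.
rng = np.random.default_rng(1)
p_hat = rng.uniform(0.05, 0.95, size=500)          # model's predicted P(Y=1|x)
y = (rng.uniform(size=500) < p_hat).astype(float)  # observed outcomes

# Hosmer-Lemeshow idea: sort observations by predicted probability, split
# them into g groups, then compare observed vs. expected events per group.
g = 10
order = np.argsort(p_hat)
groups = np.array_split(order, g)

hl_stat = 0.0
for idx in groups:
    observed = y[idx].sum()        # observed events in the group
    expected = p_hat[idx].sum()    # expected events under the fitted model
    n = len(idx)
    hl_stat += (observed - expected) ** 2 / (expected * (1 - expected / n))

# Under the null hypothesis that the model fits adequately, hl_stat is
# approximately chi-square distributed with g - 2 degrees of freedom.
p_value = chi2.sf(hl_stat, df=g - 2)
print(f"HL statistic = {hl_stat:.2f}, p-value = {p_value:.3f}")
```

Note that this only checks the fit of a model that already exists; the parameter estimation itself happened earlier, via maximum likelihood.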

0 Answers