
In my dataset, the target variable has three labels: Normal, Suspect, and Pathology.

Using multinomial logistic regression, I calculated the weights for each predictive feature and got something like this:

[screenshot of the estimated coefficient table omitted]

When estimating the model, identification constraints are required on the parameters. These constraints do not influence the goodness of model fit, odds ratios, estimated probabilities, interpretations, or conclusions. Identification constraints do affect the specific values of parameter estimates. The typical constraints are to set the parameter values of the baseline category equal to zero (e.g., $\alpha_j = \beta_j = 0$) or to set the sum of the parameters across categories equal to zero (e.g., $\sum_j \beta_j = 0$).
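To make this concrete, here is a minimal NumPy sketch of what I understand the non-identifiability to be (the coefficient values are made up for illustration, not taken from my actual model): adding the same vector to every class's coefficients leaves the softmax probabilities unchanged, so the estimates are only pinned down once a constraint such as baseline-zero or sum-to-zero is imposed.

```python
import numpy as np

# Hypothetical coefficients for 3 classes (Normal, Suspect, Pathology),
# columns = [intercept, feature 1, feature 2]; values are made up.
beta = np.array([
    [ 0.5, -1.2,  0.3],   # Normal
    [-0.2,  0.4,  1.1],   # Suspect
    [ 1.0,  0.8, -0.7],   # Pathology
])

x = np.array([1.0, 0.6, -0.4])   # one observation, leading 1 for the intercept

def softmax(z):
    z = z - z.max()                      # numerical stability
    return np.exp(z) / np.exp(z).sum()

p_original = softmax(beta @ x)

# Add the same arbitrary vector to every class's coefficients:
# all linear predictors shift by the same constant, so the predicted
# probabilities do not change -> the raw parameters are not identified.
shift = np.array([10.0, -3.0, 2.5])
p_shifted = softmax((beta + shift) @ x)
print(np.allclose(p_original, p_shifted))            # True

# The sum-to-zero parameterization subtracts the across-class mean,
# which picks one representative from this family of equivalent fits.
beta_sum_zero = beta - beta.mean(axis=0)
print(beta_sum_zero.sum(axis=0))                     # ~[0, 0, 0]
print(np.allclose(softmax(beta_sum_zero @ x), p_original))   # True
```

As far as I can tell, the sum-to-zero version is just one way of picking a single representative from this set of equivalent coefficient vectors, which is what I would like confirmed and explained.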

Question: why does the sum of the coefficients equal zero? How is this constraint imposed, and why is it necessary?

CORy
    This sounds like a standard characteristic of any multinomial logistic regression: because it estimates odds ratios, only the differences matter, so (depending on how you specify the model) the software will standardize its outputs to sum to zero. But it would be nice to have you confirm that's what you're doing. For further explanation follow the links from this site search. – whuber Dec 07 '22 at 16:19
  • @whuber Yes, that is what I'm doing. – CORy Dec 07 '22 at 18:11
  • @whuber What do you mean by "because it estimates odds ratios, only the differences matter" and "the software will standardize its outputs to sum to zero"? – CORy Dec 07 '22 at 18:19
  • That's all explained in some of the hits in that site search. Check out the first one. – whuber Dec 07 '22 at 19:00

0 Answers