
I know how Naive Bayes works for binary classification problems. I just need to know the standard way to apply NB to multi-class classification problems. Any ideas, please?

1 Answer


Unlike some classifiers, multi-class labeling is trivial with Naive Bayes.

For each test example $i$, you want to find the class $k$ that maximizes the posterior probability: $$\arg \max_k P(\textrm{class}_k \mid \textrm{data}_i)$$

In other words, you compute the probability of each class label in the usual way, then pick the class with the largest probability.
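As a concrete illustration, here is a minimal from-scratch multinomial Naive Bayes sketch over three made-up classes. The class names, documents, and helper functions are all hypothetical, chosen only to show that multi-class prediction is just an arg max over per-class posteriors (computed in log space, with Laplace smoothing):

```python
import math
from collections import Counter, defaultdict

# Toy training data (invented for illustration): word lists labeled
# with one of three classes.
train = [
    (["goal", "match", "team"], "sports"),
    (["election", "vote", "party"], "politics"),
    (["goal", "team", "win"], "sports"),
    (["cpu", "gpu", "code"], "tech"),
    (["vote", "party", "debate"], "politics"),
    (["code", "bug", "gpu"], "tech"),
]

# Estimate class priors and per-class word counts.
class_counts = Counter(label for _, label in train)
word_counts = defaultdict(Counter)
vocab = set()
for words, label in train:
    word_counts[label].update(words)
    vocab.update(words)

def log_posterior(words, label):
    # log P(class) + sum_w log P(w | class), with Laplace (add-one) smoothing.
    logp = math.log(class_counts[label] / len(train))
    total = sum(word_counts[label].values())
    for w in words:
        logp += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
    return logp

def predict(words):
    # Multi-class labeling: arg max over all classes, exactly as in the
    # binary case, just with more candidates.
    return max(class_counts, key=lambda c: log_posterior(words, c))

print(predict(["goal", "win"]))     # -> "sports"
print(predict(["vote", "debate"]))  # -> "politics"
```

Nothing about the arg max changes as the number of classes grows; the only difference from the binary case is the size of the set being maximized over.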

Matt Krause
  • Thank you, Matt. As you said, it is pretty straightforward. I suspect this would not be the case with SVMs, for example. – Mohammadreza Mar 25 '15 at 03:30
  • My pleasure. For other methods, there are (many) ways of combining two-way classifiers (like SVMs) into a multi-class system. I think there has also been some work on extending SVMs to do this "natively." – Matt Krause Mar 25 '15 at 15:52
  • Hi, can anyone explain how to set a minimum threshold for naive Bayes in such cases? – Syed M. Sannan Apr 14 '23 at 20:33
  • People often just pick the most probable class, regardless of how (un)likely it may be. I suppose you could refuse to classify something if none of the probabilities are large and return "none of the above" but the threshold for doing that is, AFAIK, entirely up to you. – Matt Krause Apr 22 '23 at 01:30
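The "none of the above" idea from the comment above can be sketched in a few lines. The posterior values and the threshold here are invented for illustration; as the comment notes, the threshold choice is entirely up to you:

```python
# Hypothetical normalized posteriors for a single test example.
probs = {"sports": 0.40, "politics": 0.35, "tech": 0.25}

# User-chosen rejection threshold -- there is no standard value.
THRESHOLD = 0.5

best_class = max(probs, key=probs.get)
label = best_class if probs[best_class] >= THRESHOLD else "none of the above"
print(label)  # -> "none of the above", since 0.40 < 0.5
```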