I have below an example I pulled from sklearn's sklearn.metrics.classification_report documentation.
What I don't understand is why there are f1-score, precision, and recall values for each class, where I believe the class is the predicted label? I thought the f1-score told you the overall accuracy of the model. Also, what does the support column tell us? I couldn't find any info on that.
print(classification_report(y_true, y_pred, target_names=target_names))

             precision    recall  f1-score   support

    class 0       0.50      1.00      0.67         1
    class 1       0.00      0.00      0.00         1
    class 2       1.00      0.67      0.80         3

avg / total       0.70      0.60      0.61         5
avg / total? It does not seem to match the column means... How is it computed and what does it mean? – Antoine Oct 19 '16 at 08:03

It is the support-weighted average of each column, not the plain column mean: (0.50*1 + 0.00*1 + 1.00*3)/5 = 0.70. The "total" refers to the total support, which is 5 here. – Nitin May 21 '17 at 03:56
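The support-weighted average described in the comment above can be reproduced by hand. A minimal sketch, assuming the same toy data used in the sklearn docs example that produces the report quoted in the question:

```python
from sklearn.metrics import precision_recall_fscore_support

# Toy data from the sklearn docs example that produces the quoted report.
y_true = [0, 1, 2, 2, 2]
y_pred = [0, 0, 2, 2, 1]

# Per-class metrics; "support" is simply the number of true samples per class.
p, r, f, support = precision_recall_fscore_support(y_true, y_pred)
print(list(support))  # [1, 1, 3]

# The "avg / total" row is the support-weighted mean of each column:
avg_p = sum(pi * si for pi, si in zip(p, support)) / sum(support)
print(round(avg_p, 2))  # 0.7

# Same number straight from sklearn:
pw, rw, fw, _ = precision_recall_fscore_support(y_true, y_pred, average="weighted")
print(round(pw, 2))  # 0.7
```

So the row is not the column mean; classes with more true samples (larger support) pull the average toward their scores.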