Questions tagged [confusion-matrix]

A confusion matrix is a contingency table used to evaluate the predictive accuracy of a classifier. In its narrowest sense, it is the 2x2 frequency table with counts of "true positives", "true negatives", "false positives", and "false negatives" obtained when classifying cases into a class of interest versus everything else. In a broader sense, any kxk frequency cross-tabulation of "predicted" versus "actual" classes can be called a confusion matrix in the context of evaluating a classifier.

A confusion matrix is a special contingency table used to evaluate the predictive accuracy of a classifier. Predicted classes are listed in rows (or columns) and actual classes in columns (or rows), with the count of cases for each combination listed in the corresponding cell. Cases on the main diagonal are correctly classified, while those in the off-diagonal cells are misclassified. Inspection of the confusion matrix can identify which classes tend to be 'confused' for each other. The confusion matrix also allows the calculation of model performance metrics such as sensitivity and specificity, precision and recall, positive and negative predictive value, etc.
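
For a binary classifier these metrics can be read directly off the four cell counts. A minimal R sketch, assuming a hypothetical 2x2 table with the positive class listed first (the counts are made up for illustration):

# Hypothetical 2x2 confusion matrix: rows = predicted, columns = actual,
# with the positive class listed first
cm <- matrix(c(40, 10,    # predicted positive: TP, FP
                5, 45),   # predicted negative: FN, TN
             nrow = 2, byrow = TRUE,
             dimnames = list(Predicted = c("pos", "neg"),
                             Actual    = c("pos", "neg")))

TP <- cm["pos", "pos"]; FP <- cm["pos", "neg"]
FN <- cm["neg", "pos"]; TN <- cm["neg", "neg"]

sensitivity <- TP / (TP + FN)   # recall, true positive rate
specificity <- TN / (TN + FP)   # true negative rate
ppv         <- TP / (TP + FP)   # precision, positive predictive value
npv         <- TN / (TN + FN)   # negative predictive value
accuracy    <- (TP + TN) / sum(cm)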

Here is an example confusion matrix from a model of Fisher's iris data with good accuracy. The model occasionally confuses versicolor and virginica, but never misclassifies either as setosa.

                      Actual:
             setosa versicolor virginica 
Predicted:
  setosa         50          0         0  
  versicolor      0         47         3  
  virginica       0          4        46 
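
A table like the one above is typically produced by cross-tabulating predicted against actual class labels. A minimal R sketch, assuming a vector of predicted species is already available (the predicted vector below is only a placeholder, not the output of a real model):

actual    <- iris$Species
predicted <- actual                 # placeholder; replace with the model's predictions
cm <- table(Predicted = predicted, Actual = actual)
cm                                  # the k x k confusion matrix
sum(diag(cm)) / sum(cm)             # overall accuracy: proportion on the main diagonal
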
292 questions
6
votes
5 answers

Are FAR and FRR the same as FPR and FNR, respectively?

FPR = False Positive Rate, FNR = False Negative Rate, FAR = False Acceptance Rate, FRR = False Rejection Rate. Are they the same? If not, is it possible to calculate FAR and FRR from the confusion matrix? Thank you
Aizzaac
  • 1,179
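
For the rates asked about above, a minimal sketch from hypothetical 2x2 counts; in biometric verification FAR and FRR are commonly computed the same way as FPR and FNR, though terminology varies across fields:

# Hypothetical 2x2 counts
TP <- 40; FP <- 10; FN <- 5; TN <- 45

FPR <- FP / (FP + TN)   # false positive rate = 1 - specificity
FNR <- FN / (FN + TP)   # false negative rate = 1 - sensitivity
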
5
votes
2 answers

How to make a confusion matrix from comparing prediction results of two algorithms?

I applied two unsupervised algorithms to the same data and would like to make a confusion matrix out of the results. How should I achieve this in R? An example with R code like the following: xx.1 <- c(41, 0, 4, 0, 0, 0, 0, 0, 0, 7, 0, 11, 8, 0, 0, 0, 0, 0,…
ccshao
  • 667
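
For the cross-tabulation step in the question above, a minimal R sketch (the label vectors are hypothetical cluster assignments standing in for the two algorithms' output):

labels1 <- c(1, 1, 2, 3, 2, 1, 3, 3)   # assignments from algorithm 1
labels2 <- c(2, 2, 1, 3, 1, 2, 3, 1)   # assignments from algorithm 2

# Cross-tabulate one labelling against the other. With unsupervised methods
# the cluster numbers are arbitrary, so rows and columns may need to be
# matched up (e.g. by maximising the diagonal) before reading this as a
# confusion matrix.
table(Algorithm1 = labels1, Algorithm2 = labels2)
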
4
votes
2 answers

How to find true positive, true negative, false positive, false negative from a three-class confusion matrix?

I built a confusion matrix for three classes (a, b, c are the classes):

       a    b    c   <= predicted
  a   20    5    0
  b    7   18    0
  c    0    0   20

Now I want to calculate precision and recall from this confusion matrix. In order to do that I need to find out…
mmr
  • 176
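
For the matrix in the question above, the four counts are defined per class in a one-vs-rest fashion. A minimal R sketch for class "a", using the numbers from the question (rows = actual, columns = predicted):

cm <- matrix(c(20,  5,  0,
                7, 18,  0,
                0,  0, 20),
             nrow = 3, byrow = TRUE,
             dimnames = list(Actual = c("a", "b", "c"),
                             Predicted = c("a", "b", "c")))

TP <- cm["a", "a"]            # actual a, predicted a
FN <- sum(cm["a", ]) - TP     # actual a, predicted something else
FP <- sum(cm[, "a"]) - TP     # predicted a, actually something else
TN <- sum(cm) - TP - FN - FP  # cases not involving class a at all

precision <- TP / (TP + FP)
recall    <- TP / (TP + FN)
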
2
votes
2 answers

Is there a version of accuracy weighted by prevalence?

I have been learning the basic terminology for how to think about binary tests in a medical testing context. The basic terms are here in this table. This is the confusion matrix. My issue is the following. Note, I assume $T = N + P$ is the size of the…
2
votes
1 answer

Is the F-score invariant to confusion matrix transposition?

Is the F-score invariant to confusion matrix transposition? In other words, if actual values are substituted with predicted values and vice-versa, would the F-score stay the same?
2
votes
1 answer

False Discovery Rate = FP / (TP + FP)?

In the context of diagnostic screening (Positive / Negative) for a given outcome, what is the correct term for FP / (TP + FP)? Wikipedia calls it the False Discovery Rate. However, when I look for more credible references, I'm unable to find any.…
2
votes
1 answer

How do you calculate the cost of a confusion matrix with more than two classes?

I have been looking online and couldn't find an answer to my question. I have a 3-class confusion matrix as well as its cost matrix. I know how to do this for two classes, but for three I am unsure how to apply this formula. Confusion…
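
One common convention, sketched below under the assumption that the cost matrix has the same actual-by-predicted layout as the confusion matrix (both matrices here are hypothetical), is to multiply the two element-wise and sum:

# Hypothetical 3x3 confusion matrix (rows = actual, columns = predicted)
cm <- matrix(c(50,  3,  2,
                4, 45,  6,
                1,  5, 40), nrow = 3, byrow = TRUE)

# Hypothetical cost matrix in the same layout: cost per case of predicting
# column j when the actual class is row i (zero cost on the diagonal)
cost <- matrix(c(0, 1, 2,
                 1, 0, 1,
                 5, 2, 0), nrow = 3, byrow = TRUE)

total_cost   <- sum(cm * cost)        # total misclassification cost
average_cost <- total_cost / sum(cm)  # expected cost per case
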
1
vote
0 answers

How to measure sensitivity, specificity, PPV and NPV from a confusion matrix for multiclass classification

I want to calculate sensitivity, specificity, PPV, and NPV from a confusion matrix for multiclass classification. My objective is to learn the basic concept of constructing a confusion matrix for multiclass classification and how to calculate…
SUZ
  • 11
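
A vectorised one-vs-rest sketch for the question above, reusing the three-class matrix from the earlier question as a stand-in for the asker's data:

cm <- matrix(c(20,  5,  0,
                7, 18,  0,
                0,  0, 20),
             nrow = 3, byrow = TRUE,
             dimnames = list(Actual = c("a", "b", "c"),
                             Predicted = c("a", "b", "c")))

TP <- diag(cm)
FN <- rowSums(cm) - TP          # actual class i, predicted as something else
FP <- colSums(cm) - TP          # predicted class i, actually something else
TN <- sum(cm) - TP - FN - FP

sensitivity <- TP / (TP + FN)   # per-class recall
specificity <- TN / (TN + FP)
ppv         <- TP / (TP + FP)   # per-class precision
npv         <- TN / (TN + FN)
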
1
vote
0 answers

ZeroR algorithm and confusion matrix for an equally distributed target variable

I am facing a multinomial classification problem with 3 classes in the target variable. Each of these 3 classes has exactly 369 instances. Now, I know the zero-rule algorithm predicts the majority class of the target variable. However, how does it…
1
vote
0 answers

False positives, false negatives, true positives, true negatives

I'm just starting to learn some proper statistics and recently learned about FP, FN, TP, and TN. I'm a little confused as to how these work. Firstly, let's say I have a way to predict whether a variable X is either 1 or 0, considering 1 to be positive…