User Manual
ASUS, Inc.
© 2021 ASUS, Inc. All rights reserved.
Precision is the ratio of true positive events to the total
number of predicted positive events.
Recall = TP / (TP+FN)
Recall is the ratio of true positive events to the total
number of actual positive events.
Loss = FN / (TP+FN)
Loss is the ratio of false negative events to the total
number of actual positive events.
Detection = TP / (TP+FN)
Detection is equal to Recall.
Overkill = FP / (FP+TN)
Overkill is the ratio of false positive events to the total
number of actual negative events.
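The metric definitions above can be sketched as a few small Python helpers. This is only an illustration of the formulas; the function names are ours, not part of any product API.

```python
# Helpers for the confusion-matrix metrics defined above.
# tp/fp/tn/fn = counts of true/false positive/negative events.

def precision(tp, fp):
    # True positives over all predicted positive events.
    return tp / (tp + fp)

def recall(tp, fn):
    # True positives over all actual positive events.
    return tp / (tp + fn)

# Detection is equal to Recall.
detection = recall

def loss(tp, fn):
    # False negatives over all actual positive events.
    return fn / (tp + fn)

def overkill(fp, tn):
    # False positives over all actual negative events.
    return fp / (fp + tn)
```

Note that Recall and Loss always sum to 1, since every actual positive event is either detected (TP) or missed (FN).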
Examples of Confusion Matrix
Let's say there is a batch of 100 examined parts. The true condition is that
92 are good and 8 are defective.
The examination events are as below:
1. 5 pcs examined as defective, and they are actually defective parts (True Positive)
2. 90 pcs examined as good, and they are actually good parts (True Negative)
3. 3 pcs examined as defective, but they are actually good parts (False
Positive)
4. 2 pcs examined as good, but they are actually defective parts (False Negative)
Hence, the results are as below:
1. Accuracy = (TP+TN) / (TP+TN+FP+FN) = (5+90) / (5+90+3+2) = 95%,
meaning 95% of all examination events give the correct result.
2. Precision = TP / (TP+FP) = 5 / (5+3) = 62.5%, meaning 62.5% of the
events examined as defective are actually defective.
3. Recall or Detection = TP / (TP+FN) = 5 / (5+2) = 71.4%, meaning 71.4% of the
actual defective parts are examined as defective.
4. Loss = FN / (TP+FN) = 2 / (5+2) = 28.6%, meaning 28.6% of the actual defective parts
are not examined as defective.
5. Overkill = FP / (FP+TN) = 3 / (3+90) = 3.2%, meaning 3.2% of the actual good
parts are examined as defective.
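The arithmetic of the example above can be checked with a short Python snippet (the variable names are ours, for illustration only):

```python
# Worked example: 100 parts, of which TP=5, TN=90, FP=3, FN=2.
tp, tn, fp, fn = 5, 90, 3, 2

accuracy  = (tp + tn) / (tp + tn + fp + fn)  # 95 / 100
precision = tp / (tp + fp)                   # 5 / 8
recall    = tp / (tp + fn)                   # 5 / 7, same as Detection
loss      = fn / (tp + fn)                   # 2 / 7
overkill  = fp / (fp + tn)                   # 3 / 93

print(f"Accuracy  = {accuracy:.1%}")   # Accuracy  = 95.0%
print(f"Precision = {precision:.1%}")  # Precision = 62.5%
print(f"Recall    = {recall:.1%}")     # Recall    = 71.4%
print(f"Loss      = {loss:.1%}")       # Loss      = 28.6%
print(f"Overkill  = {overkill:.1%}")   # Overkill  = 3.2%
```

These match the five results listed above, with Recall and Loss together accounting for all 8 defective parts.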