Making sense of the confusion matrix

How do you interpret a confusion matrix? How can it help you to evaluate your machine learning model? What rates can you calculate from a confusion matrix, and what do they actually mean?

In this video, I'll start by explaining how to interpret a confusion matrix for a binary classifier:
0:49 What is a confusion matrix?
2:14 An example confusion matrix
5:13 Basic terminology
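To make the terminology concrete, here is a minimal sketch (not from the video) of how the four cells of a binary confusion matrix are counted. The labels are made up for illustration, with 1 as the positive class:

```python
# Hypothetical binary labels, just for illustration (1 = positive class).
y_true = [1, 1, 1, 1, 0, 0, 0, 1, 0, 0]
y_pred = [1, 1, 0, 1, 0, 1, 0, 1, 0, 0]

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # false positives
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # true negatives

# Laid out with actual classes as rows and predicted classes as columns:
# [[tn, fp],
#  [fn, tp]]
```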

Then, I'll walk through the calculations for some common rates:
11:20 Accuracy
11:56 Misclassification Rate / Error Rate
13:20 True Positive Rate / Sensitivity / Recall
14:19 False Positive Rate
14:54 True Negative Rate / Specificity
15:58 Precision
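All six of these rates are simple ratios of the four confusion-matrix cells. A quick sketch, using example counts (the numbers are made up):

```python
# Example cell counts from a 2x2 confusion matrix (values are illustrative).
tp, fp, fn, tn = 4, 1, 1, 4
total = tp + fp + fn + tn

accuracy = (tp + tn) / total       # overall fraction classified correctly
error_rate = (fp + fn) / total     # misclassification rate = 1 - accuracy
recall = tp / (tp + fn)            # true positive rate / sensitivity
fpr = fp / (fp + tn)               # false positive rate
specificity = tn / (tn + fp)       # true negative rate = 1 - FPR
precision = tp / (tp + fp)         # fraction of positive predictions that are correct
```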

Finally, I'll conclude with more advanced topics:
19:10 How to calculate precision and recall for multi-class problems
24:17 How to analyze a 10-class confusion matrix
28:26 How to choose the right evaluation metric for your problem
31:31 Why accuracy is often a misleading metric
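For the multi-class case, precision and recall are computed per class in a one-vs-rest fashion: the true positives for class k sit on the diagonal, the rest of row k are its false negatives, and the rest of column k are its false positives. A rough sketch with an invented 3-class matrix (rows = actual, columns = predicted):

```python
# Hypothetical 3-class confusion matrix; the counts are made up.
cm = [
    [50,  3,  2],   # actual class 0
    [ 4, 40,  6],   # actual class 1
    [ 1,  5, 44],   # actual class 2
]

for k in range(len(cm)):
    tp = cm[k][k]                          # correct predictions for class k
    fn = sum(cm[k]) - tp                   # rest of the row: class-k examples missed
    fp = sum(row[k] for row in cm) - tp    # rest of the column: wrongly predicted as k
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    print(f"class {k}: precision={precision:.3f}, recall={recall:.3f}")
```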


== RELATED RESOURCES ==

My confusion matrix blog post:
https://www.dataschool.io/simple-guid...

Evaluating a classifier with scikit-learn (video):
   • How to evaluate a classifier in sciki...  

ROC curves and AUC explained (video):
   • ROC Curves and Area Under the Curve (...  


== DATA SCHOOL INSIDERS ==

Join "Data School Insiders" on Patreon for bonus content:
  / dataschool  


== WANT TO GET BETTER AT MACHINE LEARNING? ==

1) WATCH my scikit-learn video series:
   • Machine learning in Python with sciki...  

2) SUBSCRIBE for more videos:
https://www.youtube.com/dataschool?su...

3) ENROLL in my Machine Learning course:
https://www.dataschool.io/learn/

4) LET'S CONNECT!
Newsletter: https://www.dataschool.io/subscribe/
Twitter:   / justmarkham  
Facebook:   / datascienceschool  
LinkedIn:   / justmarkham  
