ROC AUC
The ROC curve you plotted in the last exercise looked promising.
Now you will compute the area under the ROC curve, along with the other classification metrics you have used previously.
The confusion_matrix and classification_report functions have been preloaded for you, along with the logreg model you previously built, plus X_train, X_test, y_train, and y_test. Also, the model's predicted test set labels are stored as y_pred, and the probabilities of the test set observations belonging to the positive class are stored as y_pred_probs.
A knn model has also been created and its performance metrics printed in the console, so you can compare the roc_auc_score, confusion_matrix, and classification_report between the two models.
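
For context, here is a minimal sketch of how the preloaded objects might have been produced. The synthetic dataset, split parameters, and default LogisticRegression settings are assumptions for illustration only; they are not part of the exercise environment.

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic binary classification data as a stand-in for the exercise dataset (assumption)
X, y = make_classification(n_samples=1000, n_features=10, random_state=42)

# Train/test split; the split proportions here are assumed, not specified by the exercise
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# Fit the logistic regression model on the training data
logreg = LogisticRegression()
logreg.fit(X_train, y_train)

# Predicted labels and positive-class probabilities for the test set
y_pred = logreg.predict(X_test)
y_pred_probs = logreg.predict_proba(X_test)[:, 1]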
This exercise is part of the course Supervised Learning with scikit-learn.
Exercise instructions
- Import roc_auc_score.
- Calculate and print the ROC AUC score, passing the test labels and the predicted positive class probabilities.
- Calculate and print the confusion matrix.
- Call classification_report(), passing the test labels and the predicted labels.
Interactive hands-on exercise
Try this exercise by completing the sample code.
# Import roc_auc_score
____
# Calculate roc_auc_score
print(____(____, ____))
# Calculate the confusion matrix
print(____(____, ____))
# Calculate the classification report
print(____(____, ____))
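
One possible completion of the blanks, assuming the preloaded names described above (confusion_matrix, classification_report, y_test, y_pred, y_pred_probs); this is a reference sketch, not the only valid answer.

# Import roc_auc_score
from sklearn.metrics import roc_auc_score

# Calculate roc_auc_score from the positive-class probabilities
print(roc_auc_score(y_test, y_pred_probs))

# Calculate the confusion matrix from the predicted labels
print(confusion_matrix(y_test, y_pred))

# Calculate the classification report from the predicted labels
print(classification_report(y_test, y_pred))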