
ROC AUC

The ROC curve you plotted in the last exercise looked promising.

Now you will compute the area under the ROC curve (ROC AUC), along with the other classification metrics you have used previously. The ROC AUC summarizes the curve as a single number: the probability that the model ranks a randomly chosen positive observation above a randomly chosen negative one.

The confusion_matrix and classification_report functions have been preloaded for you, along with the logreg model you built previously, plus X_train, X_test, y_train, and y_test. The model's predicted test set labels are stored as y_pred, and the probabilities of the test set observations belonging to the positive class are stored as y_pred_probs.

A knn model has also been created, and its performance metrics have been printed in the console, so you can compare the roc_auc_score, confusion_matrix, and classification_report between the two models.
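
These objects are preloaded in the exercise environment. If you want to reproduce a similar setup on your own machine, the sketch below shows one way to do it; the synthetic dataset and the KNN setting (n_neighbors=6) are assumptions for illustration only, not the course's data or hyperparameters.

# Minimal, self-contained sketch of a comparable setup (assumptions noted above)
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Stand-in binary classification data (the course uses its own dataset)
X, y = make_classification(n_samples=1000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

# Fit the logistic regression model
logreg = LogisticRegression(max_iter=1000)
logreg.fit(X_train, y_train)

# Predicted test set labels
y_pred = logreg.predict(X_test)

# Probabilities of belonging to the positive class (second column of predict_proba)
y_pred_probs = logreg.predict_proba(X_test)[:, 1]

# A KNN model for comparison (n_neighbors=6 is an arbitrary choice here)
knn = KNeighborsClassifier(n_neighbors=6)
knn.fit(X_train, y_train)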

This exercise is part of the course Supervised Learning with scikit-learn.


Exercise instructions

  • Import roc_auc_score.
  • Calculate and print the ROC AUC score, passing the test labels and the predicted positive class probabilities.
  • Calculate and print the confusion matrix, passing the test labels and the predicted labels.
  • Calculate and print the classification report, again passing the test labels and the predicted labels.

Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

# Import roc_auc_score
____

# Calculate roc_auc_score
print(____(____, ____))

# Calculate the confusion matrix
print(____(____, ____))

# Calculate the classification report
print(____(____, ____))
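
For reference, one possible completion of the template is sketched below. It assumes y_test, y_pred, and y_pred_probs are available as described above; confusion_matrix and classification_report are imported here only so the snippet runs on its own, since they are already preloaded in the exercise.

# Import roc_auc_score (confusion_matrix and classification_report are added
# here only to make the snippet self-contained; they are preloaded in the exercise)
from sklearn.metrics import roc_auc_score, confusion_matrix, classification_report

# Calculate roc_auc_score using the positive class probabilities, not the labels
print(roc_auc_score(y_test, y_pred_probs))

# Calculate the confusion matrix from the predicted labels
print(confusion_matrix(y_test, y_pred))

# Calculate the classification report from the predicted labels
print(classification_report(y_test, y_pred))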