Exercise

Calculating the ROC/AUC score

While Recall is an important metric for assessing a classification model, it focuses entirely on the number of False Negatives and ignores False Positives. Precision, on the other hand, focuses on the number of False Positives and ignores False Negatives.

The ROC curve combines both perspectives: it plots the True Positive Rate (recall) against the False Positive Rate, which reflects the number of False Positives. The area under the ROC curve is summarized as the AUC score.
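As a quick illustration (not part of the exercise), the snippet below computes Recall, Precision, and the ROC/AUC score on a small set of made-up labels and scores; the arrays here are invented purely for demonstration:

```python
from sklearn.metrics import recall_score, precision_score, roc_auc_score

# Toy example: true labels plus a model's hard predictions and scores
y_true = [0, 0, 1, 1, 1, 0, 1, 0]
y_pred = [0, 1, 1, 1, 0, 0, 1, 0]                     # predicted classes
y_scores = [0.1, 0.6, 0.8, 0.9, 0.4, 0.2, 0.7, 0.3]   # predicted probabilities

print(recall_score(y_true, y_pred))     # penalizes False Negatives
print(precision_score(y_true, y_pred))  # penalizes False Positives
print(roc_auc_score(y_true, y_scores))  # area under the ROC curve
```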

In this exercise, you will calculate the ROC/AUC score for the initial model using the sklearn roc_auc_score() function.

The variables features_test and target_test are available in your workspace.

Instructions

  • Import the function to calculate the ROC/AUC score.
  • Use the initial model to predict churn (based on the features of the test set).
  • Calculate the ROC/AUC score by comparing target_test with the prediction, as sketched below.
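A minimal sketch of these steps, assuming the fitted initial model is stored in a variable called clf (the actual variable name in your workspace may differ):

```python
from sklearn.metrics import roc_auc_score

# Predict churn on the test set with the initial model
# (assumed here to be stored in a variable called clf)
prediction = clf.predict(features_test)

# Compare the true test labels with the predictions
print(roc_auc_score(target_test, prediction))
```

Note that passing class probabilities (for example clf.predict_proba(features_test)[:, 1]) instead of hard predictions typically yields a more informative AUC, since the score then reflects the full ranking of the test examples.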