
Calculating the ROC/AUC score

While the Recall score is an important metric for evaluating a classification algorithm, it is concerned only with the number of False Negatives. Precision, on the other hand, focuses only on the number of False Positives.

The ROC curve combines both perspectives by plotting the true positive rate (recall) against the false positive rate, so both types of error are captured at once. The area under the ROC curve is reported as the AUC score.
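To make the distinction concrete, here is a minimal sketch (with made-up labels and churn scores, assuming scikit-learn is installed) that computes all three metrics side by side:

from sklearn.metrics import recall_score, precision_score, roc_auc_score

y_true  = [0, 0, 1, 1, 1, 0, 1, 0]   # actual churn labels (illustrative only)
y_pred  = [0, 1, 1, 1, 0, 0, 1, 0]   # hard 0/1 predictions
y_score = [0.2, 0.6, 0.9, 0.7, 0.4, 0.1, 0.8, 0.3]  # predicted churn probabilities

print(recall_score(y_true, y_pred))      # penalized by False Negatives
print(precision_score(y_true, y_pred))   # penalized by False Positives
print(roc_auc_score(y_true, y_score))    # single number summarizing the ROC curve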

In this exercise, you will calculate the ROC/AUC score for the initial model using the sklearn roc_auc_score() function.

The variables features_test and target_test are available in your workspace.

This exercise is part of the course

HR Analytics: Predicting Employee Churn in Python

Exercise instructions

  • Import the function to calculate ROC/AUC score.
  • Use the initial model to predict churn (based on the features of the test set).
  • Calculate ROC/AUC score by comparing target_test with the prediction.

Hands-on interactive exercise

Complete this sample code to finish the exercise.

# Import the function to calculate ROC/AUC score
from sklearn.____ import ____

# Use initial model to predict churn (based on features_test)
prediction = model.predict(____)

# Calculate ROC/AUC score by comparing target_test with the prediction
____(____, prediction)
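For reference, one way the completed exercise might look is sketched below; it assumes model, features_test and target_test are already defined in the workspace, as stated above.

# Import the function to calculate ROC/AUC score
from sklearn.metrics import roc_auc_score

# Use the initial model to predict churn (based on features_test)
prediction = model.predict(features_test)

# Calculate ROC/AUC score by comparing target_test with the prediction
print(roc_auc_score(target_test, prediction))

Because model.predict() returns hard 0/1 labels, this AUC reflects a single operating point; for classifiers that expose predict_proba(), passing the churn probabilities (e.g. model.predict_proba(features_test)[:, 1]) as the second argument traces out the full ROC curve and is usually preferred.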