
Visually scoring credit models

Now, you want to visualize the performance of the model. In ROC charts, the X and Y axes are two metrics you've already looked at: the false positive rate (fall-out), and the true positive rate (sensitivity).

You can create a ROC chart of its performance with the following code:

fallout, sensitivity, thresholds = roc_curve(y_test, prob_default)
plt.plot(fallout, sensitivity)

To calculate the AUC score, you use roc_auc_score().
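As a quick illustration of how roc_curve() and roc_auc_score() fit together (using a tiny set of made-up labels and probabilities rather than the exercise's credit data):

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Toy data: true labels and predicted probabilities of the positive class
y_true = np.array([0, 0, 1, 1])
probs = np.array([0.1, 0.4, 0.35, 0.8])

# roc_curve returns the fall-out (FPR), sensitivity (TPR), and thresholds
fallout, sensitivity, thresholds = roc_curve(y_true, probs)

# roc_auc_score summarizes the whole curve in one number between 0 and 1
auc = roc_auc_score(y_true, probs)
print(auc)  # 0.75 for this toy data
```

An AUC of 0.5 corresponds to the diagonal "random guess" line, while 1.0 is a perfect classifier.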

The credit data cr_loan_prep along with the data sets X_test and y_test have all been loaded into the workspace. A trained LogisticRegression() model named clf_logistic has also been loaded into the workspace.

This exercise is part of the course

Credit Risk Modeling in Python

Exercise instructions

  • Create a set of predictions for probability of default and store them in preds.
  • Print the accuracy score of the model on the X and y test sets.
  • Use roc_curve() on the test data and probabilities of default to create fallout and sensitivity. Then, create a ROC curve plot with fallout on the x-axis.
  • Compute the AUC of the model using test data and probabilities of default and store it in auc.

Hands-on interactive exercise

Try this exercise by completing the sample code below.

# Create predictions and store them in a variable
____ = clf_logistic.____(____)

# Print the accuracy score of the model
print(clf_logistic.____(____, ____))

# Plot the ROC curve of the probabilities of default
prob_default = preds[:, 1]
fallout, sensitivity, thresholds = ____(____, ____)
plt.plot(fallout, sensitivity, color = 'darkorange')
plt.plot([0, 1], [0, 1], linestyle='--')
plt.____()

# Compute the AUC and store it in a variable
____ = ____(____, ____)
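For reference, a self-contained sketch of what the completed code looks like. The real exercise runs against the pre-loaded X_test, y_test, and clf_logistic; here a stand-in logistic regression is fit on synthetic data so the script can run on its own.

```python
import matplotlib
matplotlib.use('Agg')  # non-interactive backend so the script runs headless
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve, roc_auc_score
from sklearn.model_selection import train_test_split

# Stand-in for the pre-loaded credit data and trained model
X, y = make_classification(n_samples=1000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
clf_logistic = LogisticRegression().fit(X_train, y_train)

# Create predictions and store them in a variable
preds = clf_logistic.predict_proba(X_test)

# Print the accuracy score of the model
print(clf_logistic.score(X_test, y_test))

# Plot the ROC curve of the probabilities of default
prob_default = preds[:, 1]
fallout, sensitivity, thresholds = roc_curve(y_test, prob_default)
plt.plot(fallout, sensitivity, color='darkorange')
plt.plot([0, 1], [0, 1], linestyle='--')
plt.show()

# Compute the AUC and store it in a variable
auc = roc_auc_score(y_test, prob_default)
print(auc)
```

Note that predict_proba() returns one column per class, so column 1 holds the probability of default (the positive class).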