Cross-validation scoring
Now, use cross-validation scoring with cross_val_score() to check the model's overall performance.
This exercise presents an excellent opportunity to test out the hyperparameters learning_rate and max_depth. Remember, hyperparameters are like settings that can help the model reach its best performance.
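For example, hyperparameters are passed as keyword arguments when an XGBoost model is created (a minimal illustration only, assuming xgboost is imported as xgb; the values shown here are arbitrary and not the ones used in this exercise):

import xgboost as xgb

# Hyperparameters are fixed before training: learning_rate scales the
# contribution of each boosting round, and max_depth limits tree depth.
clf = xgb.XGBClassifier(learning_rate=0.05, max_depth=3)
print(clf.get_params()['max_depth'])  # 3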
The data sets cr_loan_prep, X_train, and y_train have already been loaded in the workspace.
This exercise is part of the course Credit Risk Modeling in Python.
Exercise instructions
- Create a gradient boosted tree with a learning rate of 0.1 and a max depth of 7. Store the model as gbt.
- Calculate the cross validation scores against the X_train and y_train data sets with 4 folds. Store the results as cv_scores.
- Print the cross validation scores.
- Print the average accuracy score and standard deviation with formatting.
Hands-on interactive exercise
Have a go at this exercise by completing this sample code.
# Create a gradient boosted tree model using two hyperparameters
____ = xgb.____(____ = ____, ____ = ____)
# Calculate the cross validation scores for 4 folds
____ = ____(____, ____, np.ravel(____), cv = ____)
# Print the cross validation scores
print(____)
# Print the average accuracy and standard deviation of the scores
print("Average accuracy: %0.2f (+/- %0.2f)" % (____.____(),
____.____() * 2))
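One possible completed version of the template is sketched below. It is a sketch, not the official solution: it assumes xgboost is imported as xgb, numpy as np, that cross_val_score comes from sklearn.model_selection, and that X_train and y_train are already loaded as stated above.

import numpy as np
import xgboost as xgb
from sklearn.model_selection import cross_val_score

# Create a gradient boosted tree model using two hyperparameters
gbt = xgb.XGBClassifier(learning_rate=0.1, max_depth=7)

# Calculate the cross validation scores for 4 folds
cv_scores = cross_val_score(gbt, X_train, np.ravel(y_train), cv=4)

# Print the cross validation scores
print(cv_scores)

# Print the average accuracy and standard deviation of the scores
print("Average accuracy: %0.2f (+/- %0.2f)" % (cv_scores.mean(),
                                               cv_scores.std() * 2))

Reporting the mean plus or minus two standard deviations gives a rough interval for the accuracy across folds, which is more informative than a single score from one train/test split.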