Cross-validation scoring
Now, use cross-validation scoring with cross_val_score() to check the model's overall performance. With cv=4, the training data is split into four folds; the model is fit on three folds and scored on the held-out fold, once per fold.
This exercise presents an excellent opportunity to test the hyperparameters learning_rate and max_depth. Remember, hyperparameters are settings chosen before training that can be tuned to improve the model's performance.
The data sets cr_loan_prep, X_train, and y_train have already been loaded in the workspace.
This exercise is part of the course
Credit Risk Modeling in Python
Exercise instructions
- Create a gradient boosted tree with a learning rate of 0.1 and a max depth of 7. Store the model as gbt.
- Calculate the cross validation scores against the X_train and y_train data sets with 4 folds. Store the results as cv_scores.
- Print the cross validation scores.
- Print the average accuracy score and standard deviation with formatting.
Interactive exercise
Complete the sample code to finish this exercise.
# Import the packages used in this exercise
import numpy as np
import xgboost as xgb
from sklearn.model_selection import cross_val_score

# Create a gradient boosted tree model using two hyperparameters
gbt = xgb.XGBClassifier(learning_rate=0.1, max_depth=7)

# Calculate the cross validation scores for 4 folds
cv_scores = cross_val_score(gbt, X_train, np.ravel(y_train), cv=4)

# Print the cross validation scores
print(cv_scores)

# Print the average accuracy and standard deviation of the scores
print("Average accuracy: %0.2f (+/- %0.2f)" % (cv_scores.mean(),
                                               cv_scores.std() * 2))
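Because the exercise frames learning_rate and max_depth as settings to tune, it can help to see how the mean cross-validation score shifts as they change. The sketch below is illustrative only: it assumes the same workspace variables (X_train, y_train) and imports as above, and the candidate values are hypothetical rather than part of the course exercise.

# Illustrative sketch: compare mean CV accuracy across a few hypothetical settings
for lr in (0.05, 0.1, 0.3):
    for depth in (3, 7):
        model = xgb.XGBClassifier(learning_rate=lr, max_depth=depth)
        scores = cross_val_score(model, X_train, np.ravel(y_train), cv=4)
        print("learning_rate=%.2f, max_depth=%d: mean accuracy %0.3f"
              % (lr, depth, scores.mean()))

Reporting two standard deviations alongside the mean, as in the final print statement of the exercise, gives a quick sense of how much the accuracy varies from fold to fold.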