Tuning max_depth
In this exercise, your job is to tune max_depth, which is the parameter that dictates the maximum depth that each tree in a boosting round can grow to. Smaller values will lead to shallower trees, and larger values to deeper trees.
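As a quick illustration of how "max_depth" enters the parameter dictionary, here is a minimal standalone sketch (not part of the exercise): it trains two boosters at different depths on synthetic data, so the arrays, seed, and depths shown are illustrative assumptions rather than the course's housing data.

# Illustrative only: synthetic data, not the exercise's housing DMatrix
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(123)
X_demo = rng.normal(size=(200, 4))
y_demo = X_demo @ np.array([1.0, -2.0, 0.5, 3.0]) + rng.normal(scale=0.1, size=200)
demo_dmatrix = xgb.DMatrix(data=X_demo, label=y_demo)

for depth in (2, 10):
    params = {"objective": "reg:squarederror", "max_depth": depth}
    booster = xgb.train(params=params, dtrain=demo_dmatrix, num_boost_round=10)
    # Deeper trees fit the training data more closely (and can overfit)
    print(depth, booster.eval(demo_dmatrix))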
This exercise is part of the course
Extreme Gradient Boosting with XGBoost
Instructions
- Create a list called max_depths to store the following "max_depth" values: 2, 5, 10, and 20.
- Iterate over your max_depths list using a for loop.
- Systematically vary "max_depth" in each iteration of the for loop and perform 2-fold cross-validation with early stopping (5 rounds), 10 boosting rounds, a metric of "rmse", and a seed of 123. Ensure the output is a DataFrame.
Interactive hands-on exercise
Try this exercise by completing the sample code below.
# Create your housing DMatrix
housing_dmatrix = xgb.DMatrix(data=X, label=y)
# Create the parameter dictionary
params = {"objective":"reg:squarederror"}
# Create list of max_depth values
max_depths = ____
best_rmse = []
# Systematically vary the max_depth
for curr_val in ____:
    params["____"] = ____
    
    # Perform cross-validation
    cv_results = ____
    
    
    
    # Append the final round rmse to best_rmse
    best_rmse.append(cv_results["test-rmse-mean"].tail().values[-1])
# Print the resultant DataFrame
print(pd.DataFrame(list(zip(max_depths, best_rmse)), columns=["max_depth", "best_rmse"]))
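For reference, here is one way the blanks might be filled in. This is a sketch, not the official solution: it assumes X and y already hold the housing features and target loaded earlier in the course, and that xgboost and pandas are imported as xgb and pd.

# Possible completion; assumes X and y (housing data) are already loaded
import xgboost as xgb
import pandas as pd

housing_dmatrix = xgb.DMatrix(data=X, label=y)
params = {"objective": "reg:squarederror"}

# Candidate max_depth values to compare
max_depths = [2, 5, 10, 20]
best_rmse = []

for curr_val in max_depths:
    params["max_depth"] = curr_val

    # 2-fold CV, 10 boosting rounds, early stopping after 5 rounds, RMSE metric, seed 123
    cv_results = xgb.cv(dtrain=housing_dmatrix, params=params, nfold=2,
                        num_boost_round=10, early_stopping_rounds=5,
                        metrics="rmse", as_pandas=True, seed=123)

    # Keep the final-round test RMSE for this max_depth
    best_rmse.append(cv_results["test-rmse-mean"].tail().values[-1])

print(pd.DataFrame(list(zip(max_depths, best_rmse)), columns=["max_depth", "best_rmse"]))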