Evaluating model quality
It's now time to begin evaluating model quality.
Here, you will compare the RMSE and MAE of a cross-validated XGBoost model on the Ames housing data. As in previous exercises, all necessary modules have been pre-loaded, the data is available in the DataFrame df, and the feature matrix X and target vector y have already been created from it.
This exercise is part of the course Extreme Gradient Boosting with XGBoost.
Hands-on interactive exercise
Try this exercise by completing the sample code below.
# Create the DMatrix from the pre-loaded feature matrix X and target y: housing_dmatrix
housing_dmatrix = xgb.DMatrix(data=X, label=y)

# Create the parameter dictionary: params
params = {"objective": "reg:squarederror", "max_depth": 4}

# Perform cross-validation, tracking RMSE on the held-out folds: cv_results
# (4 folds and 5 boosting rounds are example values)
cv_results = xgb.cv(dtrain=housing_dmatrix, params=params, nfold=4,
                    num_boost_round=5, metrics="rmse", as_pandas=True, seed=123)

# Print the full per-round results
print(cv_results)

# Extract and print the test RMSE from the final boosting round
print((cv_results["test-rmse-mean"]).tail(1))
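The scaffold above tracks RMSE. Since the exercise also asks you to look at MAE, a minimal follow-up sketch, assuming the same housing_dmatrix and params objects and using a hypothetical cv_results_mae name, simply repeats the call with metrics="mae":

# Repeat cross-validation, this time tracking MAE on the held-out folds
cv_results_mae = xgb.cv(dtrain=housing_dmatrix, params=params, nfold=4,
                        num_boost_round=5, metrics="mae", as_pandas=True, seed=123)

# Print the full per-round results
print(cv_results_mae)

# Extract and print the test MAE from the final boosting round
print((cv_results_mae["test-mae-mean"]).tail(1))

Comparing the two tail values gives the test RMSE and test MAE at the final boosting round, which is the comparison this exercise asks for.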