Evaluating model quality
It's now time to begin evaluating model quality.
Here, you will compare the RMSE and MAE of a cross-validated XGBoost model on the Ames housing data. As in previous exercises, all necessary modules have been pre-loaded and the data is available in the DataFrame df, with the feature matrix X and target array y already created.
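Recall that MAE (mean absolute error) weights every error equally, while RMSE (root mean squared error) squares errors before averaging, so it penalizes large misses more heavily. As a quick refresher, here is a minimal sketch of both metrics computed with NumPy; the y_true and y_pred arrays are hypothetical and not part of the exercise setup.
import numpy as np

# Hypothetical true and predicted sale prices (illustrative values only)
y_true = np.array([200000.0, 150000.0, 320000.0])
y_pred = np.array([210000.0, 140000.0, 290000.0])

# MAE: mean of the absolute errors
mae = np.mean(np.abs(y_true - y_pred))

# RMSE: square root of the mean of the squared errors
rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))

print("MAE:", mae)
print("RMSE:", rmse)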
This exercise is part of the course Extreme Gradient Boosting with XGBoost.
Hands-on interactive exercise
Have a go at this exercise by completing this sample code.
# Create the DMatrix: housing_dmatrix
housing_dmatrix = xgb.DMatrix(data=X, label=y)
# Create the parameter dictionary: params
params = {"objective":"reg:squarederror", "max_depth":4}
# Perform cross-validation: cv_results
cv_results = ____(dtrain=____, params=____, nfold=____, num_boost_round=____, metrics=____, as_pandas=True, seed=123)
# Print cv_results
print(cv_results)
# Extract and print final boosting round metric
print((cv_results["____"]).tail(1))
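For reference, a completed version of the call might look like the sketch below. It assumes xgboost is available as xgb and that X and y have been created from df as described above; the fold count of 4 and the 5 boosting rounds are illustrative choices rather than required values. Running the cross-validation once with "rmse" and once with "mae" lets you compare both metrics on the same data.
# Perform 4-fold cross-validation, scoring with RMSE (illustrative settings)
cv_results_rmse = xgb.cv(dtrain=housing_dmatrix, params=params, nfold=4,
                         num_boost_round=5, metrics="rmse", as_pandas=True, seed=123)

# Extract and print the test RMSE from the final boosting round
print(cv_results_rmse["test-rmse-mean"].tail(1))

# Repeat the cross-validation, scoring with MAE
cv_results_mae = xgb.cv(dtrain=housing_dmatrix, params=params, nfold=4,
                        num_boost_round=5, metrics="mae", as_pandas=True, seed=123)

# Extract and print the test MAE from the final boosting round
print(cv_results_mae["test-mae-mean"].tail(1))
Because RMSE squares errors before averaging, a large gap between the RMSE and MAE values usually indicates that a small number of houses are being predicted with very large errors.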