Tree-based AdaBoost regression
AdaBoost models are usually built with decision trees as the base estimators. Let's give this a try now and see if model performance improves even further.
We'll use twelve estimators, as before, for a fair comparison. There's no need to instantiate a decision tree explicitly, since it is the default base estimator.
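To make that last point concrete, here is a minimal sketch (not part of the exercise) assuming scikit-learn 1.2 or later, where the keyword is estimator (older releases call it base_estimator): leaving the base estimator out is equivalent to passing a shallow regression tree explicitly.
from sklearn.ensemble import AdaBoostRegressor
from sklearn.tree import DecisionTreeRegressor

# Equivalent models: when no estimator is given, AdaBoostRegressor
# defaults to a shallow regression tree (DecisionTreeRegressor(max_depth=3)).
reg_default = AdaBoostRegressor(n_estimators=12, random_state=500)
reg_explicit = AdaBoostRegressor(
    estimator=DecisionTreeRegressor(max_depth=3),
    n_estimators=12,
    random_state=500,
)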
Exercise instructions
- Build and fit an AdaBoostRegressor using 12 estimators. You do not have to specify a base estimator.
- Calculate the predictions on the test set.
Hands-on interactive exercise
Have a go at this exercise by completing this sample code.
# Build and fit a tree-based AdaBoost regressor
reg_ada = ____(____, random_state=500)
reg_ada.fit(X_train, y_train)
# Calculate the predictions on the test set
pred = ____
# Evaluate the performance using the RMSE
rmse = np.sqrt(mean_squared_error(y_test, pred))
print('RMSE: {:.3f}'.format(rmse))
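For reference, a filled-in version might look like the sketch below. It is not the course's official solution: the exercise assumes X_train, X_test, y_train, and y_test are already prepared, so the synthetic data generated here is only a stand-in to make the snippet run on its own.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import AdaBoostRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Stand-in data; the course exercise uses its own preloaded dataset.
X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# Build and fit a tree-based AdaBoost regressor
reg_ada = AdaBoostRegressor(n_estimators=12, random_state=500)
reg_ada.fit(X_train, y_train)

# Calculate the predictions on the test set
pred = reg_ada.predict(X_test)

# Evaluate the performance using the RMSE
rmse = np.sqrt(mean_squared_error(y_test, pred))
print('RMSE: {:.3f}'.format(rmse))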