A first attempt at bagging
You've seen what happens in a single iteration of a bagging ensemble. Now let's build a custom bagging model!
Two functions have been prepared for you:
def build_decision_tree(X_train, y_train, random_state=None):
    # Takes a sample with replacement,
    # builds a "weak" decision tree,
    # and fits it to the train set

def predict_voting(classifiers, X_test):
    # Makes the individual predictions
    # and then combines them using "Voting"
The build_decision_tree() function is essentially what you did in the previous exercise. Here, you will build multiple such trees and then combine them. Let's see if this ensemble of "weak" models improves performance!
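For reference, the two helpers could look roughly like the sketch below. This is a minimal, hypothetical implementation rather than the course's actual code: it assumes NumPy arrays, a shallow scikit-learn DecisionTreeClassifier as the "weak" learner, and a simple majority vote.

import numpy as np
from collections import Counter
from sklearn.tree import DecisionTreeClassifier

def build_decision_tree(X_train, y_train, random_state=None):
    # Draw a bootstrap sample (with replacement) of the training set
    X, y = np.asarray(X_train), np.asarray(y_train)
    rng = np.random.RandomState(random_state)
    idx = rng.choice(len(X), size=len(X), replace=True)
    # Fit a "weak" (shallow) tree to the sampled data;
    # max_depth=3 is an assumed setting, not the course's exact choice
    tree = DecisionTreeClassifier(max_depth=3, random_state=random_state)
    tree.fit(X[idx], y[idx])
    return tree

def predict_voting(classifiers, X_test):
    # Collect the individual predictions: one row per classifier
    all_preds = np.array([clf.predict(X_test) for clf in classifiers])
    # Majority vote per test sample (column-wise)
    return np.array([Counter(col).most_common(1)[0][0] for col in all_preds.T])

Bootstrapping inside build_decision_tree() is what makes each tree see a slightly different training set, which is where the ensemble's diversity comes from.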
This exercise is part of the course Ensemble Methods in Python.

Exercise instructions
- Build the individual models by calling build_decision_tree(), passing the training set and the index i as the random state.
- Predict the labels of the test set using predict_voting(), with the list of classifiers clf_list and the input test features.
Hands-on interactive exercise

Try this exercise and complete the sample code.
# Build the list of individual models
clf_list = []
for i in range(21):
    weak_dt = ____
    clf_list.append(weak_dt)

# Predict on the test set
pred = ____

# Print the F1 score
print('F1 score: {:.3f}'.format(f1_score(y_test, pred)))
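Following the instructions, one way to fill in the blanks is sketched below. It assumes that X_train, y_train, X_test, and y_test are already defined in the exercise environment, and that f1_score comes from sklearn.metrics.

from sklearn.metrics import f1_score

# Build the list of individual models, seeding each tree with the loop index
clf_list = []
for i in range(21):
    weak_dt = build_decision_tree(X_train, y_train, random_state=i)
    clf_list.append(weak_dt)

# Predict on the test set by majority vote across the 21 trees
pred = predict_voting(clf_list, X_test)

# Print the F1 score of the ensemble
print('F1 score: {:.3f}'.format(f1_score(y_test, pred)))

Using a different random state for each call means each tree is fit on a different bootstrap sample, so the 21 trees are not identical copies of one another.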