A first attempt at bagging
You've seen what happens in a single iteration of a bagging ensemble. Now let's build a custom bagging model!
Two functions have been prepared for you:
def build_decision_tree(X_train, y_train, random_state=None):
    # Takes a sample with replacement,
    # builds a "weak" decision tree,
    # and fits it to the train set

def predict_voting(classifiers, X_test):
    # Makes the individual predictions
    # and then combines them using "Voting"
Technically, the build_decision_tree() function does what you did in the previous exercise. Here, you will build multiple such trees and then combine them. Let's see if this ensemble of "weak" models improves performance!
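The exact bodies of these helpers aren't shown in the exercise; a minimal sketch of what they might look like, assuming NumPy arrays, integer (0/1) class labels, and a shallow DecisionTreeClassifier as the "weak" learner, could be:

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def build_decision_tree(X_train, y_train, random_state=None):
    # Draw a bootstrap sample (sampling with replacement) of the training set
    rng = np.random.RandomState(random_state)
    idx = rng.choice(len(X_train), size=len(X_train), replace=True)
    # Fit a "weak" (shallow) decision tree to the bootstrap sample
    weak_dt = DecisionTreeClassifier(max_depth=3, random_state=random_state)
    weak_dt.fit(X_train[idx], y_train[idx])
    return weak_dt

def predict_voting(classifiers, X_test):
    # Collect the individual predictions: shape (n_classifiers, n_samples)
    preds = np.array([clf.predict(X_test) for clf in classifiers])
    # Combine them with majority ("hard") voting, assuming integer class labels
    return np.array([np.bincount(col).argmax() for col in preds.T])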
This exercise is part of the course
Ensemble Methods in Python
Exercise instructions
- Build the individual models by calling build_decision_tree(), passing the training set and the index i as the random state.
- Predict the labels of the test set using predict_voting(), with the list of classifiers clf_list and the input test features.
Interactive exercise
Complete the sample code to finish this exercise successfully.
# Build the list of individual models
clf_list = []
for i in range(21):
    weak_dt = ____
    clf_list.append(weak_dt)
# Predict on the test set
pred = ____
# Print the F1 score
print('F1 score: {:.3f}'.format(f1_score(y_test, pred)))
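One possible way to fill in the blanks, assuming f1_score has been imported from sklearn.metrics and that X_train, y_train, X_test, and y_test are already defined:

from sklearn.metrics import f1_score

# Build the list of individual models, seeding each tree with the loop index
clf_list = []
for i in range(21):
    weak_dt = build_decision_tree(X_train, y_train, random_state=i)
    clf_list.append(weak_dt)

# Predict on the test set by majority voting over the 21 trees
pred = predict_voting(clf_list, X_test)

# Print the F1 score
print('F1 score: {:.3f}'.format(f1_score(y_test, pred)))

Because each tree is trained on a different bootstrap sample, the combined vote typically scores higher than any single weak tree would on its own.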