
A first attempt at bagging

You've seen what happens in a single iteration of a bagging ensemble. Now let's build a custom bagging model!

Two functions have been prepared for you:

def build_decision_tree(X_train, y_train, random_state=None):
    """Takes a sample of the train set with replacement,
    builds a "weak" decision tree,
    and fits it to that sample."""

def predict_voting(classifiers, X_test):
    """Makes the individual predictions
    and then combines them using "Voting"."""

Essentially, the build_decision_tree() function does what you did in the previous exercise. Here, you will build multiple such trees and then combine them. Let's see if this ensemble of "weak" models improves performance!
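The exact implementations live in the exercise environment, but here is a minimal sketch of what they could look like. It assumes scikit-learn's DecisionTreeClassifier, NumPy arrays for X_train and y_train, binary 0/1 labels, and an illustrative max_depth of 4; none of these details are specified by the exercise itself.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def build_decision_tree(X_train, y_train, random_state=None):
    # Draw a bootstrap sample of the train set (same size, with replacement)
    rng = np.random.RandomState(random_state)
    idx = rng.choice(len(X_train), size=len(X_train), replace=True)
    # Fit a shallow ("weak") tree to the bootstrap sample
    tree = DecisionTreeClassifier(max_depth=4, random_state=random_state)
    tree.fit(X_train[idx], y_train[idx])
    return tree

def predict_voting(classifiers, X_test):
    # Stack the individual predictions: shape (n_classifiers, n_samples)
    all_preds = np.array([clf.predict(X_test) for clf in classifiers])
    # Majority ("hard") vote per sample, assuming binary 0/1 labels
    return (all_preds.mean(axis=0) >= 0.5).astype(int)

With an odd number of trees (21 in the scaffold below), the majority vote can never tie.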

This exercise is part of the course Ensemble Methods in Python.

Exercise instructions

  • Build the individual models by calling build_decision_tree(), passing the training set and the index i as the random state.
  • Predict the labels of the test set using predict_voting(), with the list of classifiers clf_list and the input test features.

Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

# Build the list of individual models
clf_list = []
for i in range(21):
    weak_dt = ____
    clf_list.append(weak_dt)

# Predict on the test set
pred = ____

# Print the F1 score
print('F1 score: {:.3f}'.format(f1_score(y_test, pred)))
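For reference, here is one possible completion of the scaffold. It assumes that build_decision_tree(), predict_voting(), the train/test splits, and f1_score (from sklearn.metrics) are pre-loaded by the exercise environment.

# Build the list of individual models
clf_list = []
for i in range(21):
    # Pass the loop index as the random state so each tree
    # is fit on a different bootstrap sample
    weak_dt = build_decision_tree(X_train, y_train, i)
    clf_list.append(weak_dt)

# Predict on the test set by combining the 21 trees with voting
pred = predict_voting(clf_list, X_test)

# Print the F1 score
print('F1 score: {:.3f}'.format(f1_score(y_test, pred)))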