Exercise

Tuning bagging hyperparameters

While you can easily build a bagging classifier using the default parameters, you should tune them to achieve optimal performance. Ideally, these hyperparameters should be optimized using K-fold cross-validation.

In this exercise, let's see if we can improve model performance by modifying the parameters of the bagging classifier.

Here we are also passing the parameter solver='liblinear' to the LogisticRegression base estimator to reduce computation time.

Instructions

100 XP
  • Build a bagging classifier with 20 base estimators, 10 maximum features, and 0.65 (65%) maximum samples (max_samples). Sample without replacement.
  • Use clf_bag to predict the labels of the test set, X_test.