Exercise

Tuning bagging hyperparameters

While you can easily build a bagging classifier using the default parameters, it is highly recommended that you tune them to achieve optimal performance. Ideally, they should be optimized using k-fold cross-validation.
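As a rough illustration of what such a search could look like, here is a minimal sketch using scikit-learn's GridSearchCV; the parameter grid values and the X_train/y_train names are assumptions for illustration and are not part of this exercise.

```python
from sklearn.ensemble import BaggingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

# Illustrative grid of bagging hyperparameters (values chosen arbitrarily).
param_grid = {
    'n_estimators': [10, 20, 50],
    'max_samples': [0.5, 0.65, 0.8],
}

# 5-fold cross-validated search over the grid.
search = GridSearchCV(
    BaggingClassifier(LogisticRegression(solver='liblinear')),
    param_grid,
    cv=5)

# X_train and y_train are assumed to be available (not defined here).
search.fit(X_train, y_train)
print(search.best_params_)
```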

In this exercise, let's see if we can improve model performance by modifying the parameters of the bagging classifier.

Here, we also pass solver='liblinear' to LogisticRegression to reduce the computation time.

Instructions

  • Build a bagging classifier that uses logistic regression as its base estimator, with 20 base estimators, at most 10 features (max_features), 65% of the samples per estimator (max_samples=0.65), and sampling without replacement; one possible sketch follows these instructions.
  • Use clf_bag to predict the labels of the test set, X_test.
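
A minimal sketch of one possible solution is below. It assumes the exercise has already loaded X_train, y_train, X_test, and y_test; the random_state value is an arbitrary choice. Note that recent scikit-learn versions name the first parameter estimator, while older versions (likely the one used by this course) call it base_estimator.

```python
from sklearn.ensemble import BaggingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Logistic regression as the base estimator; liblinear keeps training fast.
clf_lr = LogisticRegression(solver='liblinear')

# 20 base estimators, at most 10 features and 65% of the samples per
# estimator, sampled without replacement (bootstrap=False).
# Use base_estimator= instead of estimator= on scikit-learn < 1.2.
clf_bag = BaggingClassifier(
    estimator=clf_lr,
    n_estimators=20,
    max_features=10,
    max_samples=0.65,
    bootstrap=False,
    random_state=500)  # arbitrary seed for reproducibility

# Fit on the training data (assumed pre-loaded) and predict the test labels.
clf_bag.fit(X_train, y_train)
y_pred = clf_bag.predict(X_test)

# y_test is also assumed pre-loaded; this just reports test accuracy.
print(accuracy_score(y_test, y_pred))
```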