
Grid search CV for model complexity

In the last slide, you saw how most classifiers have one or more hyperparameters that control their complexity, and you learned to tune them using GridSearchCV(). In this exercise, you will perfect that skill. You will experiment with the following (see the sketch after this list):

  • The number of trees, n_estimators, in a RandomForestClassifier.
  • The maximum depth, max_depth, of the decision trees used in an AdaBoostClassifier.
  • The number of nearest neighbors, n_neighbors, in KNeighborsClassifier.

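The first and third of these are named directly in param_grid, while max_depth belongs to AdaBoost's base decision tree and is reached through the nested estimator__ prefix. A minimal sketch, assuming scikit-learn 1.2+ (older releases use base_estimator instead of estimator) and synthetic data standing in for the exercise's preloaded X and y:

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the exercise's data
X, y = make_classification(n_samples=200, random_state=0)

# max_depth belongs to AdaBoost's base tree, so it is tuned through the
# nested 'estimator__' prefix (older scikit-learn: 'base_estimator__')
ada_grid = GridSearchCV(
    AdaBoostClassifier(estimator=DecisionTreeClassifier()),
    {'estimator__max_depth': [1, 2, 3]}, cv=3)
ada_grid.fit(X, y)

# n_neighbors is a direct parameter of KNeighborsClassifier
knn_grid = GridSearchCV(KNeighborsClassifier(), {'n_neighbors': [3, 5, 7]}, cv=3)
knn_grid.fit(X, y)

print(ada_grid.best_params_, knn_grid.best_params_)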
This exercise is part of the course Designing Machine Learning Workflows in Python.


Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

# Set a range for n_estimators from 10 to 40 in steps of 10
param_grid = {'____': range(10, ____, ____)}

# Optimize for a RandomForestClassifier() using GridSearchCV
grid = GridSearchCV(____, param_grid, cv=3)
grid.fit(X, y)
grid.best_params_
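For reference, one way the scaffold could be completed: the blanks follow the comments (range(10, 50, 10) yields 10, 20, 30, 40), and the imports and synthetic X, y below are assumptions standing in for the objects preloaded in the exercise environment.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Synthetic stand-in for the exercise's preloaded data
X, y = make_classification(n_samples=200, random_state=0)

# Set a range for n_estimators from 10 to 40 in steps of 10
param_grid = {'n_estimators': range(10, 50, 10)}

# Optimize for a RandomForestClassifier() using GridSearchCV
grid = GridSearchCV(RandomForestClassifier(), param_grid, cv=3)
grid.fit(X, y)
print(grid.best_params_)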