Grid search CV for model complexity
In the last slide, you saw how most classifiers have one or more hyperparameters that control their complexity, and you learned to tune them using GridSearchCV(). In this exercise, you will perfect this skill. You will experiment with:
- The number of trees, n_estimators, in a RandomForestClassifier.
- The maximum depth, max_depth, of the decision trees used in an AdaBoostClassifier.
- The number of nearest neighbors, n_neighbors, in a KNeighborsClassifier.
This exercise is part of the course Designing Machine Learning Workflows in Python.
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# X and y are assumed to be preloaded in the exercise environment

# Set a range for n_estimators from 10 to 40 in steps of 10
param_grid = {'n_estimators': range(10, 50, 10)}

# Optimize n_estimators for a RandomForestClassifier() using GridSearchCV
grid = GridSearchCV(RandomForestClassifier(), param_grid, cv=3)
grid.fit(X, y)
grid.best_params_
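
The same pattern applies to the other two hyperparameters in the list above. What follows is a minimal sketch, not the course's solution code: it assumes X and y are defined as above, the candidate value ranges are illustrative, and scikit-learn 1.2+ is installed, where AdaBoostClassifier exposes its base learner via the estimator parameter (older versions use base_estimator, so the nested parameter name becomes base_estimator__max_depth).

from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier

# Tune the depth of the decision trees inside AdaBoost; the
# 'estimator__' prefix reaches into the base learner
ada_grid = GridSearchCV(
    AdaBoostClassifier(estimator=DecisionTreeClassifier()),
    {'estimator__max_depth': [1, 2, 3, 4]},  # illustrative candidate depths
    cv=3)
ada_grid.fit(X, y)
ada_grid.best_params_

# Tune the number of nearest neighbors for KNN
knn_grid = GridSearchCV(
    KNeighborsClassifier(),
    {'n_neighbors': range(1, 10, 2)},  # illustrative candidate values
    cv=3)
knn_grid.fit(X, y)
knn_grid.best_params_

In each case, grid.best_params_ reports the value of the hyperparameter that achieved the best cross-validated score.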