Tuning other hyperparameters
The power of GridSearchCV really comes into play when you're tuning multiple hyperparameters: it evaluates every combination of the values you specify and identifies the best-performing one (see the short sketch after the table below). Here, you'll tune the following random forest hyperparameters:
| Hyperparameter | Purpose |
|---|---|
| criterion | Function that measures the quality of a split |
| max_features | Number of features to consider when looking for the best split |
| max_depth | Maximum depth of the tree |
| bootstrap | Whether bootstrap samples are used when building trees |
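To make "all possible combinations" concrete, here is a small illustrative sketch (not part of the exercise) that uses scikit-learn's `ParameterGrid` to expand a reduced grid into its individual candidate settings; GridSearchCV performs the same kind of expansion and then cross-validates each candidate.

```python
# Illustration only: expand a small grid into every candidate combination.
from sklearn.model_selection import ParameterGrid

small_grid = {"max_depth": [3, None],
              "criterion": ["gini", "entropy"]}

# 2 values of max_depth x 2 values of criterion = 4 candidate combinations;
# GridSearchCV would cross-validate each one and keep the best scorer.
for params in ParameterGrid(small_grid):
    print(params)
```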
The hyperparameter grid has been specified for you, along with a random forest classifier called `clf`.
This exercise is part of the course
Marketing Analytics: Predicting Customer Churn in Python
Interactive hands-on exercise
Try to solve this exercise by completing the sample code.
# Import GridSearchCV
from sklearn.model_selection import GridSearchCV

# Create the hyperparameter grid
param_grid = {"max_depth": [3, None],
              "max_features": [1, 3, 10],
              "bootstrap": [True, False],
              "criterion": ["gini", "entropy"]}

# Call GridSearchCV, passing the classifier, the hyperparameter grid, and cv=3
grid_search = ____(___, ___, cv=3)
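For reference, here is a minimal, self-contained sketch of the completed pattern. It is an illustration under stated assumptions, not the graded solution: the dataset comes from `make_classification` and the `RandomForestClassifier` is created locally, whereas the exercise provides its own churn data and a prebuilt `clf`.

```python
# Minimal sketch (assumes a synthetic dataset; the exercise supplies its own data and clf)
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Hypothetical stand-in for the churn features and target
X, y = make_classification(n_samples=300, n_features=10, random_state=42)

clf = RandomForestClassifier(random_state=42)

param_grid = {"max_depth": [3, None],
              "max_features": [1, 3, 10],
              "bootstrap": [True, False],
              "criterion": ["gini", "entropy"]}

# GridSearchCV(estimator, param_grid, cv=3) evaluates all
# 2 * 3 * 2 * 2 = 24 combinations with 3-fold cross-validation (72 fits)
grid_search = GridSearchCV(clf, param_grid, cv=3)
grid_search.fit(X, y)

print(grid_search.best_params_)
print(grid_search.best_score_)
```

Once fitted, `best_params_` reports the winning combination and `best_score_` its mean cross-validated score (accuracy by default for classifiers).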