Try different max depths
We always want our machine learning models to make the best predictions possible. One way to do this is by tuning hyperparameters, which are settings for our models. We will see in more detail how these are useful in future chapters; for now, think of them as knobs we can turn to improve our predictions.
For regular decision trees, probably the most important hyperparameter is `max_depth`. This limits the number of splits in a decision tree. Let's find the best value of `max_depth` based on the R\(^2\) score of our model on the test set, which we can obtain using the `score()` method of our decision tree models.
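As a minimal sketch of what `score()` reports, the snippet below fits a `DecisionTreeRegressor` on toy data (a stand-in, not the course's dataset) and shows that `score()` returns the same R\(^2\) value as computing the coefficient of determination explicitly with `r2_score`:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import r2_score

# Toy features and targets purely so the example runs on its own
rng = np.random.RandomState(0)
X = rng.rand(200, 3)
y = 2 * X[:, 0] - X[:, 1] + 0.1 * rng.rand(200)

tree = DecisionTreeRegressor(max_depth=5)
tree.fit(X, y)

# score() returns R^2 on whatever features/targets you pass in
print(tree.score(X, y))
# Same number, computed explicitly from the predictions
print(r2_score(y, tree.predict(X)))
```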
This exercise is part of the course Machine Learning for Finance in Python.
Exercise instructions
- Loop through the values 3, 5, and 10 for use as the `max_depth` parameter in our decision tree model.
- Set the `max_depth` parameter in our `DecisionTreeRegressor` to be equal to `d` in each loop iteration.
- Print the model's score on the `train_features` and `train_targets`.
Hands-on interactive exercise
Try this exercise by completing the sample code.
```python
# Loop through a few different max depths and check the performance
for d in [____]:
    # Create the tree and fit it
    decision_tree = DecisionTreeRegressor(____)
    decision_tree.fit(train_features, train_targets)

    # Print out the scores on train and test
    print('max_depth=', str(d))
    print(decision_tree.score(____))
    print(decision_tree.score(test_features, test_targets), '\n')
```
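For reference, here is a sketch of how the completed loop might look, following the instructions above. It assumes `train_features`, `train_targets`, `test_features`, and `test_targets` already hold numeric arrays in the exercise environment; the toy data and `train_test_split` call below are only stand-ins so the snippet runs on its own.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

# Stand-in data: replace with the course's train/test features and targets
rng = np.random.RandomState(0)
features = rng.rand(300, 4)
targets = features @ np.array([1.5, -2.0, 0.5, 0.0]) + 0.1 * rng.rand(300)
train_features, test_features, train_targets, test_targets = train_test_split(
    features, targets, test_size=0.2, random_state=0)

# Loop through a few different max depths and check the performance
for d in [3, 5, 10]:
    # Create the tree with the current max_depth and fit it
    decision_tree = DecisionTreeRegressor(max_depth=d)
    decision_tree.fit(train_features, train_targets)

    # Print out the R^2 scores on train and test
    print('max_depth=', str(d))
    print(decision_tree.score(train_features, train_targets))
    print(decision_tree.score(test_features, test_targets), '\n')
```

Comparing the train and test scores for each depth shows the usual pattern: deeper trees fit the training data better, but the test score tells us whether that extra depth actually generalizes.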