Exercise

Try different max depths

We always want to optimize our machine learning models so they make the best predictions possible. One way to do this is by tuning hyperparameters, which are settings for our models that we choose before training. We will see in more detail how these are useful in future chapters, but for now think of them as knobs we can turn to tune our predictions to be as good as possible.

For regular decision trees, probably the most important hyperparameter is max_depth. This limits the number of successive splits in a decision tree. Let's find the best value of max_depth based on the R\(^2\) score of our model on the test set, which we can obtain using the score() method of our decision tree models.

Instructions

  • Loop through the values 3, 5, and 10 to use as the max_depth parameter in our decision tree model.
  • Set the max_depth parameter of our DecisionTreeRegressor equal to d in each loop iteration.
  • Print the model's score on train_features and train_targets.
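A minimal sketch of the loop described above. The exercise assumes train_features, train_targets, test_features, and test_targets already exist in your environment; here a small synthetic dataset stands in for them so the example runs on its own.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

# Synthetic stand-in data (the exercise provides its own train/test splits)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = 2 * X[:, 0] + rng.normal(scale=0.5, size=200)
train_features, test_features, train_targets, test_targets = train_test_split(
    X, y, random_state=0
)

for d in [3, 5, 10]:
    # Set max_depth to the current loop value d
    decision_tree = DecisionTreeRegressor(max_depth=d)
    decision_tree.fit(train_features, train_targets)
    # score() returns the R^2 of the model's predictions;
    # printing the test score alongside shows when the tree starts overfitting
    print("max_depth =", d)
    print("  train R^2:", decision_tree.score(train_features, train_targets))
    print("  test R^2: ", decision_tree.score(test_features, test_targets))
```

Expect the train R\(^2\) to keep rising as max_depth grows, while the test R\(^2\) eventually stalls or drops; that gap is the overfitting signal the exercise is building toward.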