
Fine-tune your model

Wow! That was a significant improvement over a regression model. Now let's see if you can further improve this performance by fine-tuning your random forest models. To do this, you will vary the mtry parameter when building your random forest models on your training data.

The default value of mtry for ranger is the square root of the total number of features (here, 6), rounded down. This results in a value of 2.
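As a quick sanity check, you can reproduce this default directly in R (a minimal sketch; the feature count of 6 comes from the dataset described above):

# ranger's default mtry: square root of the number of features, rounded down
n_features <- 6
floor(sqrt(n_features))
#> [1] 2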

This exercise is part of the course

Machine Learning in the Tidyverse


Instructions

  • Use crossing() to expand the cross-validation data for values of mtry ranging from 2 through 5.
  • Build random forest models for each fold/mtry combination.

Hands-on interactive exercise

Try this exercise by completing the sample code below.

# Prepare for tuning your cross validation folds by varying mtry
cv_tune <- cv_data %>% 
  crossing(mtry = ___:___) 

# Build a model for each fold & mtry combination
cv_model_tunerf <- cv_tune %>% 
  mutate(model = map2(.x = ___, .y = ___, ~ranger(formula = life_expectancy~., 
                                           data = .x, mtry = .y, 
                                           num.trees = 100, seed = 42)))
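For reference, one possible completed version is sketched below. It assumes the tidyr, dplyr, purrr, and ranger packages are loaded, and that cv_data contains a list column named train holding the training split for each fold (that column name is an assumption based on the earlier setup, not shown in this exercise).

library(tidyr)    # crossing()
library(dplyr)    # mutate()
library(purrr)    # map2()
library(ranger)   # ranger()

# Expand each cross-validation fold with mtry values 2 through 5
cv_tune <- cv_data %>% 
  crossing(mtry = 2:5)

# Build a random forest model for each fold & mtry combination
# (assumes each fold's training data lives in a list column named `train`)
cv_model_tunerf <- cv_tune %>% 
  mutate(model = map2(.x = train, .y = mtry,
                      ~ ranger(formula = life_expectancy ~ .,
                               data = .x, mtry = .y,
                               num.trees = 100, seed = 42)))

Each row of cv_model_tunerf then holds one fitted random forest, so you can later compute validation error per fold and average it for each mtry value to pick the best setting.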