Fine-tune your model

Wow! That was a significant improvement over a regression model. Now let's see if you can improve this performance further by fine-tuning your random forest models. To do this, you will vary the mtry parameter when building your random forest models on your training data.

The default value of mtry for ranger is the square root of the total number of features (6 here), rounded down. This results in a value of 2.
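
To see where that default comes from, a quick check in the console:

# ranger's default mtry: square root of the feature count, rounded down
floor(sqrt(6))
#> [1] 2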

This exercise is part of the course Machine Learning in the Tidyverse.

Exercise instructions

  • Use crossing() to expand the cross-validation data for values of mtry ranging from 2 through 5.
  • Build random forest models for each fold/mtry combination.

Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

# Prepare for tuning your cross validation folds by varying mtry
cv_tune <- cv_data %>% 
  crossing(mtry = ___:___) 

# Build a model for each fold & mtry combination
cv_model_tunerf <- cv_tune %>% 
  mutate(model = map2(.x = ___, .y = ___,
                      ~ranger(formula = life_expectancy ~ .,
                              data = .x, mtry = .y,
                              num.trees = 100, seed = 42)))
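
For reference, here is one possible completed version. It is a minimal sketch that assumes, as earlier in the course, that cv_data stores each fold's training data in a list column named train, and that the tidyverse and ranger packages are loaded.

# Possible completed solution (assumes cv_data has a `train` list column
# holding each fold's training data, as built earlier in the course)
library(tidyverse)  # provides crossing() and map2()
library(ranger)

# Repeat every fold once for each candidate mtry value (2 through 5)
cv_tune <- cv_data %>% 
  crossing(mtry = 2:5)

# Fit a 100-tree random forest for each fold & mtry combination
cv_model_tunerf <- cv_tune %>% 
  mutate(model = map2(.x = train, .y = mtry,
                      ~ranger(formula = life_expectancy ~ .,
                              data = .x, mtry = .y,
                              num.trees = 100, seed = 42)))

Because crossing() repeats each fold once per mtry value, map2() can then pair every training set with its corresponding mtry setting when fitting the forest.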