Fine-tune your model
Wow! That was a significant improvement over a regression model. Now let's see if you can further improve this performance by fine-tuning your random forest models. To do this, you will vary the mtry parameter when building your random forest models on your train data.
The default value of mtry for ranger is the rounded-down square root of the total number of features (6), which results in a value of 2.
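As a quick sanity check, that default can be reproduced directly in R:

# ranger's default mtry: square root of the feature count, rounded down
floor(sqrt(6))  # 6 features gives an mtry of 2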
This exercise is part of the course
Machine Learning in the Tidyverse
Exercise instructions
- Use crossing() to expand the cross-validation data for values of mtry ranging from 2 through 5.
- Build random forest models for each fold/mtry combination.
Interactive exercise
Try this exercise by completing the sample code below.
# Prepare for tuning your cross validation folds by varying mtry
cv_tune <- cv_data %>%
  crossing(mtry = ___:___)

# Build a model for each fold & mtry combination
cv_model_tunerf <- cv_tune %>%
  mutate(model = map2(.x = ___, .y = ___, ~ranger(formula = life_expectancy ~ .,
                                                  data = .x, mtry = .y,
                                                  num.trees = 100, seed = 42)))
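For reference, a completed version might look like the sketch below. It assumes, as set up earlier in the course, that cv_data carries a train list-column of training data frames; adjust the column name if yours differs.

library(tidyverse)  # crossing() from tidyr, map2() from purrr, mutate() from dplyr
library(ranger)     # fast random forest implementation

# Cross each fold with the candidate mtry values 2 through 5
cv_tune <- cv_data %>%
  crossing(mtry = 2:5)

# Fit one ranger model per fold/mtry combination on that fold's training data
# (the train list-column is assumed from the earlier fold-preparation step)
cv_model_tunerf <- cv_tune %>%
  mutate(model = map2(.x = train, .y = mtry,
                      ~ ranger(formula = life_expectancy ~ .,
                               data = .x, mtry = .y,
                               num.trees = 100, seed = 42)))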