
Fine-tune your model

Wow! That was a significant improvement over a regression model. Now let's see if you can improve this performance further by fine-tuning your random forest models. To do this, you will vary the mtry parameter when building your random forest models on your training data.

The default value of mtry for ranger is the rounded-down square root of the total number of features; with the 6 features used here, this results in a value of 2.
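As a quick check of that default, you can compute it directly in R (ranger documents its default mtry as the rounded-down square root of the number of variables):

# Default mtry used by ranger: floor(sqrt(number of predictors))
floor(sqrt(6))
#> [1] 2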

This exercise is part of the course

Machine Learning in the Tidyverse

Exercise instructions

  • Use crossing() to expand the cross-validation data for values of mtry ranging from 2 through 5.
  • Build random forest models for each fold/mtry combination (a completed sketch follows the sample code below).

Hands-on interactive exercise

Try this exercise and complete the sample code.

# Prepare for tuning your cross validation folds by varying mtry
cv_tune <- cv_data %>% 
  crossing(mtry = ___:___) 

# Build a model for each fold & mtry combination
cv_model_tunerf <- cv_tune %>% 
  mutate(model = map2(.x = ___, .y = ___, ~ranger(formula = life_expectancy~., 
                                           data = .x, mtry = .y, 
                                           num.trees = 100, seed = 42)))
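If you want to check your work, here is one possible completed version. It assumes, as in the earlier exercises of this course, that cv_data holds the training splits as data frames in a list column named train, and that dplyr, tidyr, purrr, and ranger are loaded.

# One possible completed version (assumes cv_data has a list column
# named `train` containing the training data frames for each fold)
library(dplyr)
library(tidyr)
library(purrr)
library(ranger)

# Pair every cross-validation fold with mtry values 2 through 5
cv_tune <- cv_data %>% 
  crossing(mtry = 2:5)

# Build one random forest per fold & mtry combination
cv_model_tunerf <- cv_tune %>% 
  mutate(model = map2(.x = train, .y = mtry,
                      ~ranger(formula = life_expectancy ~ .,
                              data = .x, mtry = .y,
                              num.trees = 100, seed = 42)))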