
Changing the number of hyperparameters to tune

When we examine the model object closely, we can see that caret already did some automatic hyperparameter tuning for us: the train() function automatically creates a grid of tuning parameters. By default, if p is the number of tuning parameters, the grid size is 3^p. We can also specify the number of different values to try for each hyperparameter.

The data has again been preloaded as bc_train_data. The libraries caret and tictoc have also been preloaded.
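
For example, one minimal way to see this default grid is to inspect the results stored in a previously trained model object (a sketch, assuming a gbm_model fitted earlier with train() defaults is still in the workspace):

# Inspect the grid caret built automatically (assumes gbm_model exists).
gbm_model$results        # one row per hyperparameter combination that was tried
nrow(gbm_model$results)  # number of combinations in the automatic grid
gbm_model$bestTune       # combination with the best resampled performance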

This exercise is part of the course Hyperparameter Tuning in R.

Exercise instructions

  • Test four different values for each hyperparameter with automatic tuning in caret.

Hands-on interactive exercise

Try this exercise by completing the sample code below.

# Set seed.
set.seed(42)
# Start timer.
tic()
# Train model.
gbm_model <- train(diagnosis ~ ., 
                   data = bc_train_data, 
                   method = "gbm", 
                   trControl = trainControl(method = "repeatedcv", number = 5, repeats = 3),
                   verbose = FALSE,
                   ___)
# Stop timer.
toc()
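
For reference, one possible way to fill in the blank is caret's tuneLength argument, which sets how many values of each tuning parameter to evaluate; tuneLength = 4 matches the instruction above. A sketch of the completed call, assuming the same preloaded data and libraries:

# Set seed for reproducibility.
set.seed(42)
# Start timer.
tic()
# Train model, letting caret try 4 values per tuning parameter.
gbm_model <- train(diagnosis ~ ., 
                   data = bc_train_data, 
                   method = "gbm", 
                   trControl = trainControl(method = "repeatedcv", number = 5, repeats = 3),
                   verbose = FALSE,
                   tuneLength = 4)
# Stop timer.
toc()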