Tuning preparation
Tuning preparation is the foundation for tuning success. There are two main steps in preparing to tune: marking hyperparameters for tuning with tune() in the model specification, and creating a grid of hyperparameter values to be used during tuning.
You are going to execute these two fundamental steps of the tuning process in this exercise.
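As a quick sketch of these two steps on a simpler model (a decision tree with an "rpart" engine, chosen here only for illustration and not part of the exercise), tune() marks a hyperparameter as a placeholder, and grid_regular() then expands every marked parameter into a regular grid. The use of extract_parameter_set_dials() to collect the tuning parameters is an assumption about your tidymodels version; older versions expose the same step as parameters().

library(tidymodels)

# Step 1: mark hyperparameters as tunable with tune()
tree_spec <- decision_tree(
  cost_complexity = tune(),
  min_n = tune()) %>%
  set_mode("classification") %>%
  set_engine("rpart")

# Step 2: build a regular grid over the marked parameters
tunegrid_tree <- grid_regular(extract_parameter_set_dials(tree_spec),
                              levels = 2)

tunegrid_tree

With two marked parameters and levels = 2, this small grid has 2 x 2 = 4 candidate combinations.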
This exercise is part of the course
Machine Learning with Tree-Based Models in R
Exercise instructions
- Create a boosting specification with an "xgboost" engine for a classification model using 500 trees, and mark the following parameters as tuning parameters: learn_rate, tree_depth, and sample_size. Save the result as boost_spec.
- Build a regular tuning grid for the tuning parameters of boost_spec with three levels for each parameter.
Hands-on interactive exercise
Try this exercise by completing the following sample code.
# Create the specification with placeholders
boost_spec <- boost_tree(
  trees = ___,
  ___,
  ___,
  ___) %>%
  set_mode(___) %>%
  set_engine(___)

# Create the tuning grid
tunegrid_boost <- ___(___,
                      levels = ___)

tunegrid_boost
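For reference, here is one way the blanks could be filled in. This is a sketch, assuming the tidymodels packages are loaded; extract_parameter_set_dials() is used to pull the tuning parameters from the specification, and older tidymodels versions expose the same step as parameters(boost_spec).

library(tidymodels)

# Boosting specification: 500 trees, three hyperparameters marked for tuning
boost_spec <- boost_tree(
  trees = 500,
  learn_rate = tune(),
  tree_depth = tune(),
  sample_size = tune()) %>%
  set_mode("classification") %>%
  set_engine("xgboost")

# Regular grid with three levels per tuning parameter
tunegrid_boost <- grid_regular(extract_parameter_set_dials(boost_spec),
                               levels = 3)

tunegrid_boost

With three tuning parameters and levels = 3, the resulting grid contains 3 x 3 x 3 = 27 candidate combinations.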