Tuning preparation
Tuning preparation is the foundation for tuning success. There are two main steps in preparing to tune: marking hyperparameters with tune() in the model specification and creating a grid of hyperparameter values to try during tuning.
In this exercise, you will carry out these two fundamental steps of the tuning process.
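As a minimal illustration of these two steps outside of the exercise (a sketch that assumes the tidymodels packages are loaded; the decision_tree()/min_n() choice here is only an arbitrary example, not the model you will build below):

library(tidymodels)

# Step 1: mark a hyperparameter as tunable with tune() inside the model specification
tree_spec <- decision_tree(min_n = tune()) %>%
  set_mode("classification") %>%
  set_engine("rpart")

# Step 2: create a regular grid of candidate values for the marked parameter
grid_regular(min_n(), levels = 5)

The same pattern, applied to a boosted tree specification with several tuning parameters, is what you will complete below.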
This exercise is part of the course Machine Learning with Tree-Based Models in R.
Exercise instructions
- Create a boosting specification with an "xgboost" engine for a classification model using 500 trees, and mark the following parameters as tuning parameters: learn_rate, tree_depth, and sample_size. Save the result as boost_spec.
- Build a regular tuning grid for the tuning parameters of boost_spec with three levels for each parameter.
Hands-on interactive exercise
Finish this exercise by completing the sample code.
# Create the specification with placeholders
boost_spec <- boost_tree(
  trees = ___,
  ___,
  ___,
  ___) %>%
  set_mode(___) %>%
  set_engine(___)

# Create the tuning grid
tunegrid_boost <- ___(___,
                      levels = ___)

tunegrid_boost
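For reference, here is one way the completed scaffold could look. This is a sketch, not the official solution: it assumes the tidymodels packages are loaded (parsnip and dials provide boost_tree() and grid_regular()), and it uses extract_parameter_set_dials() to pull the marked tuning parameters out of the specification; older tidymodels versions expose the same functionality as parameters().

# A possible completed version of the scaffold (sketch)
library(tidymodels)

# Boosted tree specification: 500 trees, three hyperparameters marked with tune()
boost_spec <- boost_tree(
  trees = 500,
  learn_rate = tune(),
  tree_depth = tune(),
  sample_size = tune()) %>%
  set_mode("classification") %>%
  set_engine("xgboost")

# Regular grid with three levels per tuning parameter
tunegrid_boost <- grid_regular(extract_parameter_set_dials(boost_spec),
                               levels = 3)

tunegrid_boost

Because a regular grid crosses every level of every parameter, three levels for each of the three tuning parameters yields 3 x 3 x 3 = 27 candidate combinations.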