
Tuning preparation

Tuning preparation is the foundation of tuning success. There are two main steps: marking hyperparameters for tuning with tune() in the model specification, and creating a grid of hyperparameter values to be used during tuning.

You are going to execute these two fundamental steps of the tuning process in this exercise.
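As a quick illustration of these two steps, here is a minimal sketch using a simple decision tree specification (the model, engine, and object names here are only illustrative assumptions, not part of the exercise, which uses a boosted tree). It assumes the tidymodels packages are loaded.

# Minimal sketch of the two preparation steps (illustrative only);
# assumes the tidymodels packages are available
library(tidymodels)

# Step 1: mark a hyperparameter as a tuning placeholder with tune()
tree_spec <- decision_tree(min_n = tune()) %>%
  set_mode("classification") %>%
  set_engine("rpart")

# Step 2: build a regular grid over all marked hyperparameters
tree_grid <- grid_regular(extract_parameter_set_dials(tree_spec),
                          levels = 3)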

This exercise is part of the course

Machine Learning with Tree-Based Models in R

Exercise instructions

  • Create a boosting specification with an "xgboost" engine for a classification model using 500 trees and mark the following parameters as tuning parameters: learn_rate, tree_depth, and sample_size. Save the result as boost_spec.
  • Build a regular tuning grid for the tuning parameters of boost_spec with three levels for each parameter.

Hands-on interactive exercise

Complete the sample code below to finish this exercise.

# Create the specification with placeholders
boost_spec <- boost_tree(
                trees = ___,
                ___,
                ___,
                ___) %>%
  set_mode(___) %>%
  set_engine(___)

# Create the tuning grid
tunegrid_boost <- ___(___, 
                      levels = ___)

tunegrid_boost
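
For reference, one possible completion of the scaffold is sketched below. This is an assumption about the intended solution, not the official answer, and it presumes the tidymodels packages (parsnip and dials in particular) are loaded, as in the course environment.

# One possible completion (a sketch, not the official solution);
# assumes tidymodels (parsnip, dials) is loaded
boost_spec <- boost_tree(
                trees = 500,
                learn_rate = tune(),
                tree_depth = tune(),
                sample_size = tune()) %>%
  set_mode("classification") %>%
  set_engine("xgboost")

# Regular grid with three levels per tuned parameter (3^3 = 27 combinations)
tunegrid_boost <- grid_regular(extract_parameter_set_dials(boost_spec),
                               levels = 3)

tunegrid_boost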