Grid search with h2o
Now that you have successfully trained a Random Forest model with h2o, you can apply the same concepts to training other algorithms, such as Deep Learning. In this exercise, you are going to apply a grid search to tune a Deep Learning model.
Remember that gradient boosting models have a learn_rate hyperparameter, whereas deep learning models have a rate hyperparameter.
The h2o library has already been loaded and initialized for you.
This exercise is part of the course Hyperparameter Tuning in R.
Hands-on interactive exercise: try to solve this exercise by completing the sample code below.
# Define the hyperparameter values to search over for the deep learning model
# (deep learning uses the rate hyperparameter, defined as a named list)
dl_params <- list(rate = c(0.001, 0.005, 0.01))
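With the hyperparameter list defined, the grid search itself is launched with h2o.grid(). The following is a minimal sketch, not the exercise's official solution: the training frame train, the predictor columns x, and the response column y are assumed to already exist in your session.

# Run the grid search over the rate values (sketch; train, x, y are assumed)
dl_grid <- h2o.grid(
  algorithm = "deeplearning",
  x = x,                     # assumed vector of predictor column names
  y = y,                     # assumed name of the response column
  training_frame = train,    # assumed H2OFrame holding the training data
  hyper_params = dl_params,  # the list of rate values defined above
  seed = 42
)

# Retrieve and rank the grid results by a metric that fits the problem type
h2o.getGrid(grid_id = dl_grid@grid_id, sort_by = "rmse", decreasing = FALSE)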