Grid search with h2o
Now that you have successfully trained a Random Forest model with h2o, you can apply the same concepts to training other algorithms, like Deep Learning. In this exercise, you are going to apply a grid search to tune a model.
Remember that gradient boosting models have the hyperparameter learn_rate, whereas deep learning models have the rate hyperparameter.
The h2o library has already been loaded and initialized for you.
This exercise is part of the course
Hyperparameter Tuning in R
Hands-on interactive exercise
Have a go at this exercise by completing this sample code.
# Define hyperparameters to tune for the deep learning model
dl_params <- list(rate = c(0.001, 0.005, 0.01))
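Below is a minimal sketch of how dl_params could then be passed to h2o.grid(). The training frame (iris converted with as.h2o()), the response column, the grid_id, and the seed are illustrative assumptions and not part of the exercise; adaptive_rate is switched off here so that the fixed rate values actually take effect.

# Illustrative training frame -- replace with the course's own data
train <- as.h2o(iris)

# Grid search over the rate values defined in dl_params
dl_grid <- h2o.grid("deeplearning",
                    grid_id = "dl_grid",
                    x = setdiff(names(train), "Species"),  # predictor columns
                    y = "Species",                         # response column
                    training_frame = train,
                    hyper_params = dl_params,
                    adaptive_rate = FALSE,  # use a fixed learning rate so rate applies
                    seed = 42)

# Inspect the grid models, best logloss first
h2o.getGrid("dl_grid", sort_by = "logloss", decreasing = FALSE)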