Exercise

Grid search with h2o

Now that you have successfully trained a Random Forest model with h2o, you can apply the same concepts to training other algorithms, such as deep learning. In this exercise, you are going to apply a grid search to tune a model.

Remember that gradient boosting models have the learn_rate hyperparameter, whereas deep learning models have the rate hyperparameter.

The h2o library has already been loaded and initialized for you.

Instructions 1/4

  • Start defining a grid of hyperparameters for deep learning with h2o: for the learning rate, use the values 0.001, 0.005, and 0.01. For an overview of all available hyperparameters, see the help for h2o.deeplearning. A sketch of the grid search is shown below.
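
The following is a minimal sketch of how such a grid search could look, assuming a training frame train and predictor/response names x and y that are placeholders not defined in the exercise text; the grid and object names (dl_params, dl_grid) are likewise illustrative.

# Load and start h2o
library(h2o)
h2o.init()

# Hyperparameter grid for deep learning: the learning rate values from the instructions
dl_params <- list(rate = c(0.001, 0.005, 0.01))

# Run the grid search over h2o.deeplearning models;
# adaptive_rate is set to FALSE so that the fixed `rate` values are actually used
dl_grid <- h2o.grid("deeplearning",
                    x = x, y = y,
                    training_frame = train,
                    hyper_params = dl_params,
                    adaptive_rate = FALSE,
                    seed = 42)

Each combination in dl_params trains one model; the resulting dl_grid object can then be inspected, for example with h2o.getGrid, to compare the models by a performance metric.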