
Grid search with h2o

Now that you have successfully trained a Random Forest model with h2o, you can apply the same concepts to training other algorithms, such as deep learning. In this exercise, you are going to apply a grid search to tune a deep learning model.

Remember that gradient boosting models have the learn_rate hyperparameter, whereas deep learning models have the rate hyperparameter.

The h2o library has already been loaded and initialized for you.
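To see where these hyperparameters end up, here is a minimal sketch of a deep learning grid search with h2o.grid(). The training frame, response, and predictor names below (the iris data, train_hf, x, y) are placeholders for illustration, not the course's own data; note that in H2O deep learning, rate only takes effect when the adaptive learning rate is disabled.

library(h2o)
h2o.init()

# Placeholder data: any H2OFrame with a response column works here
train_hf <- as.h2o(iris)
y <- "Species"
x <- setdiff(colnames(train_hf), y)

# Candidate learning rates to search over
dl_params <- list(rate = c(0.001, 0.005, 0.01))

# Train one deep learning model per hyperparameter combination
dl_grid <- h2o.grid(algorithm = "deeplearning",
                    grid_id = "dl_grid",
                    x = x,
                    y = y,
                    training_frame = train_hf,
                    hyper_params = dl_params,
                    adaptive_rate = FALSE,  # rate is ignored unless adaptive_rate is disabled
                    seed = 42)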

This exercise is part of the course Hyperparameter Tuning in R.

Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

# Define hyperparameters: deep learning models tune rate (not learn_rate)
dl_params <- list(rate = c(0.001, 0.005, 0.01))
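Once the grid has finished, its models can be compared and the best one retrieved, for example with h2o.getGrid() and h2o.getModel(). The grid_id "dl_grid" and the sorting metric are assumptions carried over from the sketch above.

# Summarize the grid, sorted by logloss (lower is better)
dl_gridperf <- h2o.getGrid(grid_id = "dl_grid",
                           sort_by = "logloss",
                           decreasing = FALSE)

# The first model id is the best-performing one
best_dl <- h2o.getModel(dl_gridperf@model_ids[[1]])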