
Exercise

Bayesian hyperparameter tuning with Hyperopt

In this exercise you will set up and run a Bayesian hyperparameter optimization process using the Hyperopt package (its hp module has already been imported for you). You will set up the domain (similar to setting up the grid for a grid search), then set up the objective function. Finally, you will run the optimizer for 20 iterations.

You will need to set up the domain using the following values:

  • max_depth using a quniform distribution (between 2 and 10, in steps of 2)
  • learning_rate using a uniform distribution (0.001 to 0.9)

Note that for the purpose of this exercise, the data sample size and the number of Hyperopt and GBM iterations have been reduced. If you try this method on your own machine, use a larger search space, more trials, more cross-validation folds, and a larger dataset to really see it in action!

Instructions

  • Set up a space dictionary using the domain mentioned above.
  • Set up the objective function using a gradient boosting classifier.
  • Run the algorithm for 20 evaluations (just use the default, suggested algorithm from the slides).