
Bayesian hyperparameter tuning with Hyperopt

In this example, you will set up and run a Bayesian hyperparameter optimization process using the Hyperopt package (already imported as hp for you). You will first define the domain (similar to setting up the grid for a grid search), then define the objective function. Finally, you will run the optimizer for 20 iterations.
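The scaffold below assumes a few names are already defined for you by the exercise environment. If you try this locally instead, you would need imports along these lines (a sketch; X_train and y_train are whatever training data you supply yourself):

# Pre-imported in the exercise environment; needed if running locally
import numpy as np
from hyperopt import hp, fmin, tpe
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score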

You will need to set up the domain using the following values (a quick sampling sketch follows the list):

  • max_depth: a quniform distribution (between 2 and 10, in steps of 2)
  • learning_rate: a uniform distribution (0.001 to 0.9)
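If you are curious what these distributions actually yield, Hyperopt lets you draw samples from a search space outside of fmin via hyperopt.pyll.stochastic.sample. A minimal sketch (demo_space is a throwaway name for illustration):

# Draw one sample from the search space to inspect it
from hyperopt.pyll.stochastic import sample

demo_space = {'max_depth': hp.quniform('max_depth', 2, 10, 2),
              'learning_rate': hp.uniform('learning_rate', 0.001, 0.9)}
print(sample(demo_space))
# e.g. {'learning_rate': 0.31, 'max_depth': 6.0} -- note that quniform
# returns floats, which is why the objective below casts max_depth to int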

Note that for the purpose of this exercise, the data sample size and the number of Hyperopt and GBM iterations have been reduced. If you are trying out this method on your own machine, use a larger search space, more trials, more cross-validation folds, and a larger dataset to really see this in action!

This exercise is part of the course Hyperparameter Tuning in Python.

Exercise instructions

  • Set up a space dictionary using the domain mentioned above.
  • Set up the objective function using a gradient boosting classifier.
  • Run the algorithm for 20 evaluations (use the default TPE algorithm, tpe.suggest, from the slides).

Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

# Set up space dictionary with specified hyperparameters
space = {'max_depth': hp.____('max_depth', ____, ____, ____),
         'learning_rate': hp.____('learning_rate', ____, ____)}

# Set up objective function
def objective(params):
    params = {'max_depth': int(params[____]),
              'learning_rate': params[____]}
    gbm_clf = ____(n_estimators=100, **params)
    best_score = cross_val_score(gbm_clf, X_train, y_train, scoring='accuracy', cv=2, n_jobs=4).mean()
    loss = 1 - ____
    return ____

# Run the algorithm
best = fmin(fn=____, space=space, max_evals=____,
            rstate=np.random.default_rng(42), algo=tpe.suggest)
print(____)
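
For reference, here is one completed version of the scaffold, with the blanks filled in according to the instructions above (assuming the environment provides X_train, y_train, and the imports listed earlier):

# Set up space dictionary with the specified hyperparameters
space = {'max_depth': hp.quniform('max_depth', 2, 10, 2),
         'learning_rate': hp.uniform('learning_rate', 0.001, 0.9)}

# Set up the objective function
def objective(params):
    # quniform samples floats, so cast max_depth to int for sklearn
    params = {'max_depth': int(params['max_depth']),
              'learning_rate': params['learning_rate']}
    gbm_clf = GradientBoostingClassifier(n_estimators=100, **params)
    best_score = cross_val_score(gbm_clf, X_train, y_train,
                                 scoring='accuracy', cv=2, n_jobs=4).mean()
    # Hyperopt minimizes the objective, so convert accuracy to a loss
    loss = 1 - best_score
    return loss

# Run the TPE algorithm for 20 evaluations
best = fmin(fn=objective, space=space, max_evals=20,
            rstate=np.random.default_rng(42), algo=tpe.suggest)
print(best)

The returned best is a dictionary of the best hyperparameter values found over the 20 evaluations.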