
Randomly Sample Hyperparameters

To undertake a random search, we first need to randomly sample our hyperparameter space.

In this exercise, you will first create some lists of hyperparameter values that can be combined into a list of lists. Then you will randomly sample hyperparameter combinations in preparation for running a random search.
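For intuition, combining two value lists into a list of lists of every pairing can be done with itertools.product. The values below are toy numbers chosen purely for illustration, not the ones used in the exercise:

from itertools import product

# Toy hyperparameter values (illustrative only)
learn_rates = [0.05, 0.1, 0.2]
min_samples = [10, 20]

# Every (learning_rate, min_samples_leaf) pairing as a list of lists
combos = [list(pair) for pair in product(learn_rates, min_samples)]
print(combos)
# [[0.05, 10], [0.05, 20], [0.1, 10], [0.1, 20], [0.2, 10], [0.2, 20]]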

You will use just the hyperparameters learning_rate and min_samples_leaf of the GBM algorithm to keep the example illustrative and not overly complicated.

This exercise is part of the course Hyperparameter Tuning in Python.


Exercise instructions

  • Create a list of 200 values for the learning_rate hyperparameter between 0.01 and 1.5 and assign it to the list learn_rate_list.
  • Create a list of values between 10 and 40 inclusive for the min_samples_leaf hyperparameter and assign it to the list min_samples_list.
  • Combine these lists into a list of lists to sample from.
  • Randomly sample 250 models from these hyperparameter combinations and print the result.

Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

# Create a list of values for the learning_rate hyperparameter
learn_rate_list = list(np.____(____,____,____))

# Create a list of values for the min_samples_leaf hyperparameter
min_samples_list = list(____(____,____))

# Combination list
combinations_list = [list(x) for x in ____(____, min_samples_list)]

# Sample hyperparameter combinations for a random search.
random_combinations_index = np.____(range(0, len(____)), ____, replace=False)
combinations_random_chosen = [combinations_list[x] for x in ____]

# Print the result
print(____)
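For reference, one possible completion of the scaffold above (a sketch, not necessarily the official solution) is shown below. The imports are an assumption: the course environment typically preloads numpy as np and itertools.product, but they are included here so the snippet runs on its own.

import numpy as np
from itertools import product

# 200 evenly spaced learning_rate values between 0.01 and 1.5
learn_rate_list = list(np.linspace(0.01, 1.5, 200))

# min_samples_leaf values from 10 to 40 inclusive
min_samples_list = list(range(10, 41))

# All combinations of the two hyperparameters as a list of lists
combinations_list = [list(x) for x in product(learn_rate_list, min_samples_list)]

# Randomly pick 250 distinct combinations for the random search
random_combinations_index = np.random.choice(range(0, len(combinations_list)), 250, replace=False)
combinations_random_chosen = [combinations_list[x] for x in random_combinations_index]

# Print the sampled hyperparameter combinations
print(combinations_random_chosen)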