MLP Grid Search
Hyperparameter tuning can be done with sklearn by supplying candidate values for each input hyperparameter; these candidate values can be generated with various functions from numpy. One tuning method, which exhaustively evaluates every combination of the hyperparameter values specified via param_grid, is grid search. In this exercise, you will use grid search to search over the hyperparameters of an MLP classifier.
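To see what "all combinations" means in practice, here is a minimal sketch (using sklearn's ParameterGrid purely for illustration; it is not needed for the exercise itself) showing how two candidate values per hyperparameter expand into the four settings that grid search would evaluate:

from sklearn.model_selection import ParameterGrid

# Two hyperparameters with two candidate values each -> 2 x 2 = 4 combinations
param_grid = {'max_iter': [10, 20], 'hidden_layer_sizes': [(8, ), (16, )]}
for combination in ParameterGrid(param_grid):
    print(combination)  # e.g. {'hidden_layer_sizes': (8,), 'max_iter': 10}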
X_train, y_train, X_test, and y_test are available in your workspace, and the features have already been standardized. pandas as pd and numpy as np are also available in your workspace.
This exercise is part of the course Predicting CTR with Machine Learning in Python.
Exercise instructions
- Create the list of values [10, 20] for max_iter, and a list of values [(8, ), (16, )] for hidden_layer_sizes.
- Set up a grid search with 4 jobs using n_jobs to iterate over all hyperparameter combinations.
- Print out the best AUC score, and the best estimator that led to this score.
Hands-on interactive exercise
Have a go at this exercise by completing this sample code.
# Create list of hyperparameters
max_iter = [____, ____]
hidden_layer_sizes = [____, ____]
param_grid = {'max_iter': max_iter, 'hidden_layer_sizes': hidden_layer_sizes}
# Use Grid search CV to find best parameters using 4 jobs
mlp = ____
clf = ____(estimator = mlp, param_grid = ____,
scoring = 'roc_auc', ____ = 4)
clf.fit(X_train, y_train)
print("Best Score: ")
print(clf.____)
print("Best Estimator: ")
print(clf.____)
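For reference, one possible completion of the scaffold is sketched below. It assumes MLPClassifier comes from sklearn.neural_network and GridSearchCV from sklearn.model_selection, and leaves every argument the exercise does not mention at its default value.

# Assumed imports (the exercise workspace may already provide these)
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import GridSearchCV

# Create list of hyperparameters
max_iter = [10, 20]
hidden_layer_sizes = [(8, ), (16, )]
param_grid = {'max_iter': max_iter, 'hidden_layer_sizes': hidden_layer_sizes}

# Use grid search CV to find the best parameters using 4 jobs
mlp = MLPClassifier()
clf = GridSearchCV(estimator = mlp, param_grid = param_grid,
                   scoring = 'roc_auc', n_jobs = 4)
clf.fit(X_train, y_train)

# best_score_ is the mean cross-validated AUC of the best combination;
# best_estimator_ is the refit MLP that achieved it
print("Best Score: ")
print(clf.best_score_)
print("Best Estimator: ")
print(clf.best_estimator_)

Note that GridSearchCV also performs cross-validation on the training data (5-fold by default in recent sklearn versions), so the printed score is a cross-validated AUC; X_test and y_test remain available for a final hold-out evaluation.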