Varying hyperparameters
The number of training iterations and the size of the hidden layers are two primary hyperparameters that can be varied when working with an MLP classifier. In this exercise, you will vary each one separately and note how performance, measured by accuracy and by the AUC of the ROC curve, changes.
X_train, y_train, X_test, and y_test are available in your workspace. Features have already been standardized using a StandardScaler(). pandas (as pd) and numpy (as np) are also available in your workspace.
This exercise is part of the course Predicting CTR with Machine Learning in Python.
# Imports needed if running outside the provided workspace
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score, roc_auc_score

# Loop over various max_iter configurations
max_iter_list = [10, 20, 30]
for max_iter in max_iter_list:
    clf = MLPClassifier(hidden_layer_sizes=(4,),
                        max_iter=max_iter, random_state=0)
    # Extract predicted probabilities and predicted class labels
    y_score = clf.fit(X_train, y_train).predict_proba(X_test)
    y_pred = clf.fit(X_train, y_train).predict(X_test)
    # Report accuracy and ROC AUC for this configuration
    print("Accuracy for max_iter = %s: %s" % (
        max_iter, accuracy_score(y_test, y_pred)))
    print("AUC for max_iter = %s: %s" % (
        max_iter, roc_auc_score(y_test, y_score[:, 1])))
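The exercise also asks you to vary the hidden layer size the same way. A minimal sketch of that second loop follows, reusing the imports and data from the code above; the specific hidden_layer_sizes values and the fixed max_iter of 30 are assumptions, not values given in the exercise.

# Hypothetical hidden_layer_sizes values; the course's actual choices may differ
hidden_layer_sizes_list = [(4,), (8,), (16,)]
for hidden_layer_sizes in hidden_layer_sizes_list:
    clf = MLPClassifier(hidden_layer_sizes=hidden_layer_sizes,
                        max_iter=30, random_state=0)
    y_score = clf.fit(X_train, y_train).predict_proba(X_test)
    y_pred = clf.fit(X_train, y_train).predict(X_test)
    print("Accuracy for hidden_layer_sizes = %s: %s" % (
        hidden_layer_sizes, accuracy_score(y_test, y_pred)))
    print("AUC for hidden_layer_sizes = %s: %s" % (
        hidden_layer_sizes, roc_auc_score(y_test, y_score[:, 1])))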