Jointly tuning gamma and C with GridSearchCV
In the previous exercise, the best value of gamma was 0.001 using the default value of C, which is 1. In this exercise you'll search for the best combination of C and gamma using GridSearchCV.
As in the previous exercise, the 2-vs-not-2 digits dataset is already loaded, but this time it's split into the variables X_train, y_train, X_test, and y_test. Even though cross-validation already splits the training set into parts, it's often a good idea to hold out a separate test set to make sure the cross-validation results are sensible.
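The course environment prepares these variables for you; as a rough sketch of how such a split could be produced outside the course (the use of load_digits and the random_state value are assumptions, not part of the exercise), you could write:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

# Hypothetical recreation of the 2-vs-not-2 task: label is True for digit 2
digits = load_digits()
X, y = digits.data, digits.target == 2

# Hold out a test set that cross-validation will never see
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
```

The held-out X_test/y_test pair is only touched once, at the very end, to check that the cross-validation results generalize.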
This exercise is part of the course Linear Classifiers in Python.
Exercise instructions
- Run GridSearchCV to find the best hyperparameters using the training set.
- Print the best values of the parameters.
- Print out the accuracy on the test set, which was not used during the cross-validation procedure.
Hands-on interactive exercise
Have a go at this exercise by completing this sample code.
# Instantiate an RBF SVM
svm = SVC()
# Instantiate the GridSearchCV object and run the search
parameters = {'C':[0.1, 1, 10], 'gamma':[0.00001, 0.0001, 0.001, 0.01, 0.1]}
searcher = GridSearchCV(svm, ____)
____.fit(____)
# Report the best parameters and the corresponding score
print("Best CV params", searcher.best_params_)
print("Best CV accuracy", searcher.best_score_)
# Report the test accuracy using these best parameters
print("Test accuracy of best grid search hypers:", searcher.score(____))
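For reference, here is one way the completed code could look. This sketch loads the data itself via load_digits and train_test_split (an assumption: the course environment preloads the split for you, so only the last block matches the exercise exactly):

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

# Hypothetical data setup: in the exercise, X_train/y_train/X_test/y_test
# are already provided
digits = load_digits()
X, y = digits.data, digits.target == 2  # 2-vs-not-2 labels
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Instantiate an RBF SVM (RBF is SVC's default kernel)
svm = SVC()

# Instantiate the GridSearchCV object and run the search over all
# 3 x 5 = 15 combinations of C and gamma
parameters = {'C': [0.1, 1, 10], 'gamma': [0.00001, 0.0001, 0.001, 0.01, 0.1]}
searcher = GridSearchCV(svm, parameters)
searcher.fit(X_train, y_train)

# Report the best parameters and the corresponding CV score
print("Best CV params", searcher.best_params_)
print("Best CV accuracy", searcher.best_score_)

# Report the test accuracy using these best parameters; score() on a
# fitted GridSearchCV uses the refit best estimator
print("Test accuracy of best grid search hypers:",
      searcher.score(X_test, y_test))
```

Note that searcher.fit refits the best model on the full training set, so searcher.score(X_test, y_test) evaluates the best hyperparameter combination on data that was never used during the search.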