Using SGDClassifier

In this final coding exercise, you'll do a hyperparameter search over the regularization strength and the loss (logistic regression vs. linear SVM) using SGDClassifier().
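As a quick reminder of how the two models map onto SGDClassifier, the short sketch below (an illustration, not part of the exercise code) shows that the loss argument selects the model while alpha sets the regularization strength; note that the 'log_loss' name assumes scikit-learn 1.1 or later (older versions call it 'log').

from sklearn.linear_model import SGDClassifier

# loss='log_loss' -> logistic regression; loss='hinge' -> linear SVM.
# alpha is the regularization strength: larger alpha means stronger regularization.
logreg = SGDClassifier(loss='log_loss', alpha=0.001, random_state=0)
svm = SGDClassifier(loss='hinge', alpha=0.001, random_state=0)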

This exercise is part of the course Linear Classifiers in Python.

Exercise instructions

  • Instantiate an SGDClassifier instance with random_state=0.
  • Use GridSearchCV to search over the regularization strength alpha and over the 'hinge' vs. 'log_loss' losses.

Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

# We set random_state=0 for reproducibility 
linear_classifier = ____(random_state=0)

# Instantiate the GridSearchCV object and run the search
parameters = {'alpha': [0.00001, 0.0001, 0.001, 0.01, 0.1, 1],
              'loss': [____]}
searcher = GridSearchCV(linear_classifier, parameters, cv=10)
searcher.fit(X_train, y_train)

# Report the best parameters and the corresponding score
print("Best CV params", searcher.best_params_)
print("Best CV accuracy", searcher.best_score_)
print("Test accuracy of best grid search hypers:", searcher.score(X_test, y_test))