
Logistic regression vs classification tree

A classification tree divides the feature space into rectangular regions. In contrast, a linear model such as logistic regression produces only a single linear decision boundary dividing the feature space into two decision regions.
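To make the contrast concrete: for a two-feature model, logistic regression's boundary is the single straight line where w1*x1 + w2*x2 + b = 0. The short sketch below is only an illustration and is not part of the exercise; the synthetic dataset from make_classification and the name logreg_demo are assumptions introduced here to show how that line can be read off a fitted model's coef_ and intercept_ attributes.

# Illustrative sketch (not part of the exercise): recover the single linear
# decision boundary of a logistic regression fitted on two synthetic features.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X_demo, y_demo = make_classification(n_samples=200, n_features=2, n_informative=2,
                                     n_redundant=0, random_state=1)

logreg_demo = LogisticRegression(random_state=1)
logreg_demo.fit(X_demo, y_demo)

w1, w2 = logreg_demo.coef_[0]      # weights of the two features
b = logreg_demo.intercept_[0]      # bias term

# The boundary is the line w1*x1 + w2*x2 + b = 0, i.e. x2 = -(w1*x1 + b) / w2:
# one straight line splitting the plane into two decision regions, unlike the
# axis-aligned rectangles produced by a classification tree.
print(f"Boundary: x2 = {-w1/w2:.3f} * x1 + {-b/w2:.3f}")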

We have written a custom function called plot_labeled_decision_regions() that you can use to plot the decision regions of a list containing two trained classifiers. You can type help(plot_labeled_decision_regions) in the shell to learn more about this function.

X_train, X_test, y_train, y_test, the model dt that you've trained in an earlier exercise, as well as the function plot_labeled_decision_regions() are available in your workspace.
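For context, a minimal sketch of how a tree like dt might have been trained in that earlier exercise is shown below. The hyperparameters (max_depth=6, random_state=1) are assumptions for illustration, not necessarily the course's exact settings.

# Hypothetical sketch of how dt could have been fit earlier
# (hyperparameter values here are assumptions).
from sklearn.tree import DecisionTreeClassifier

dt = DecisionTreeClassifier(max_depth=6, random_state=1)
dt.fit(X_train, y_train)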

This exercise is part of the course Machine Learning with Tree-Based Models in Python.

Exercise instructions

  • Import LogisticRegression from sklearn.linear_model.

  • Instantiate a LogisticRegression model and assign it to logreg.

  • Fit logreg to the training set.

  • Review the plot generated by plot_labeled_decision_regions().

Hands-on interactive exercise

Try this exercise by completing the sample code below.

# Import LogisticRegression from sklearn.linear_model
from ____.____ import ____

# Instantiate logreg
____ = ____(random_state=1)

# Fit logreg to the training set
____.____(____, ____)

# Define a list called clfs containing the two classifiers logreg and dt
clfs = [logreg, dt]

# Review the decision regions of the two classifiers
plot_labeled_decision_regions(X_test, y_test, clfs)
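One way to fill in the blanks, assuming the workspace objects listed above (X_train, y_train, X_test, y_test, dt, and plot_labeled_decision_regions) are available; this is a sketch of a solution rather than the course's official answer.

# Import LogisticRegression from sklearn.linear_model
from sklearn.linear_model import LogisticRegression

# Instantiate logreg
logreg = LogisticRegression(random_state=1)

# Fit logreg to the training set
logreg.fit(X_train, y_train)

# Define a list called clfs containing the two classifiers logreg and dt
clfs = [logreg, dt]

# Review the decision regions of the two classifiers
plot_labeled_decision_regions(X_test, y_test, clfs)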