
Adjust weights within the Voting Classifier

You've just seen that the Voting Classifier allows you to improve your fraud detection performance by combining the strengths of multiple models. Now let's adjust the weights given to these models. By increasing or decreasing a model's weight, you control how much emphasis it gets relative to the rest. This comes in handy when one model performs better overall than the rest, but you still want to combine aspects of the others to further improve your results.
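Under soft voting, the ensemble averages the class probabilities predicted by each model, and the weights scale how much each model contributes to that average. A minimal sketch of that weighted average, using made-up probabilities rather than values from the exercise:

import numpy as np

# Hypothetical predicted fraud probabilities from three models for one transaction
probas = np.array([0.20,   # e.g. the logistic regression
                   0.90,   # e.g. the random forest
                   0.40])  # e.g. the decision tree

# With equal weights, every model counts the same
print(np.average(probas, weights=[1, 1, 1]))   # 0.5

# Weighting the second model 4 to 1 pulls the ensemble towards its prediction
print(np.average(probas, weights=[1, 4, 1]))   # 0.7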

For this exercise, the data is already split into training and test sets, and clf1, clf2 and clf3 are available and defined as before, i.e. the Logistic Regression, the Random Forest and the Decision Tree, respectively.
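The exercise environment defines these objects for you. As a rough sketch of what that setup might look like (the stand-in dataset and the hyperparameters below are assumptions, not the course's exact values):

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier

# Stand-in imbalanced dataset; the course uses its own credit card fraud data
X, y = make_classification(n_samples=5000, weights=[0.99], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# The three base models; these hyperparameters are assumed for illustration
clf1 = LogisticRegression(class_weight='balanced', max_iter=1000, random_state=5)
clf2 = RandomForestClassifier(class_weight='balanced', n_estimators=100, random_state=5)
clf3 = DecisionTreeClassifier(class_weight='balanced', random_state=5)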

This exercise is part of the course

Fraud Detection in Python


Exercise instructions

  • Define an ensemble method in which you weight the second classifier (clf2) 4 to 1 relative to the other classifiers.
  • Fit the model to the training data and obtain the predictions from the ensemble model on the test set.
  • Print the performance metrics; this step is ready for you to run.

Hands-on interactive exercise

Try this exercise by completing this sample code.

# Define the ensemble model
ensemble_model = ____(estimators=[('lr', clf1), ('rf', clf2), ('gnb', clf3)],
                      voting='soft', weights=[____, ____, ____], flatten_transform=True)

# Get results 
get_model_results(X_train, y_train, X_test, y_test, ensemble_model)
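For reference, a possible completed version of the template follows, together with a sketch of what a helper such as get_model_results typically does. The helper is provided by the course; this reimplementation is only an assumption about its behaviour.

from sklearn.ensemble import VotingClassifier
from sklearn.metrics import classification_report, confusion_matrix

def get_model_results(X_train, y_train, X_test, y_test, model):
    # Assumed behaviour: fit on the training data, predict on the test set,
    # and print the standard performance metrics
    model.fit(X_train, y_train)
    predicted = model.predict(X_test)
    print(classification_report(y_test, predicted))
    print(confusion_matrix(y_test, predicted))

# Ensemble that weights the random forest (clf2) 4 to 1 relative to the other models
ensemble_model = VotingClassifier(
    estimators=[('lr', clf1), ('rf', clf2), ('gnb', clf3)],
    voting='soft', weights=[1, 4, 1], flatten_transform=True)

get_model_results(X_train, y_train, X_test, y_test, ensemble_model)

Because clf2 receives a weight of 4, the random forest's probability estimates dominate the soft vote, while the other two models still nudge the final prediction.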