
Voting Classifier

Let's now combine three machine learning models into one to improve on our Random Forest fraud detection model from before. You'll combine the usual Random Forest model with the Logistic Regression from the previous exercise and a simple Decision Tree. You can use the shortcut get_model_results() to see the immediate results of the ensemble model.

This exercise is part of the course

Fraud Detection in Python


Exercise instructions

  • Import the VotingClassifier from sklearn.ensemble.
  • Define the three models: the Logistic Regression from before, the Random Forest from previous exercises, and a Decision Tree with balanced class weights.
  • Define the ensemble model by passing in the three classifiers with their respective labels.

Interactive hands-on exercise

Try to solve this exercise by completing the sample code.

# Import the package
from sklearn.ensemble import ____

# Define the three classifiers to use in the ensemble
clf1 = LogisticRegression(class_weight={0:1, 1:15}, random_state=5)
clf2 = ____(class_weight={0:1, 1:12}, criterion='gini', max_depth=8, max_features='log2',
            min_samples_leaf=10, n_estimators=30, n_jobs=-1, random_state=5)
clf3 = DecisionTreeClassifier(random_state=5, class_weight="____")

# Combine the classifiers in the ensemble model
ensemble_model = ____(estimators=[('lr', ____), ('rf', ____), ('dt', ____)], voting='hard')

# Get the results 
get_model_results(X_train, y_train, X_test, y_test, ensemble_model)
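
If you want to check your work, here is one possible completed version of the exercise, a sketch that assumes the course workspace already provides the helper get_model_results() and the splits X_train, y_train, X_test, y_test, as in the starter code above.

# Import the package (plus the other estimators, in case they are not preloaded)
from sklearn.ensemble import VotingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

# Define the three classifiers to use in the ensemble
clf1 = LogisticRegression(class_weight={0: 1, 1: 15}, random_state=5)
clf2 = RandomForestClassifier(class_weight={0: 1, 1: 12}, criterion='gini',
                              max_depth=8, max_features='log2',
                              min_samples_leaf=10, n_estimators=30,
                              n_jobs=-1, random_state=5)
clf3 = DecisionTreeClassifier(random_state=5, class_weight="balanced")

# Combine the classifiers in the ensemble model
ensemble_model = VotingClassifier(
    estimators=[('lr', clf1), ('rf', clf2), ('dt', clf3)], voting='hard')

# Get the results (get_model_results is the course's helper, assumed available)
get_model_results(X_train, y_train, X_test, y_test, ensemble_model)

With voting='hard', the ensemble predicts the majority class label across the three classifiers; voting='soft' would average their predicted probabilities instead, which can be useful when the individual models produce well-calibrated probability estimates.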