"Weak" decision tree
In the previous exercise you built two decision trees. Which one is fine-tuned and which one is "weak"?
Decision tree "A":
min_samples_leaf = 3 and min_samples_split = 9 (F1-score: ~58%)
Decision tree "B":
max_depth = 4 and max_features = 2 (F1-score: ~53%)
Both classifiers are available for you as clf_A and clf_B.
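As a minimal sketch of how two such trees could be constructed, the snippet below fits classifiers with the exercise's hyperparameters on a synthetic dataset (the course's actual data is not available here, so the F1-scores will not match the ~58%/~53% quoted above):

```python
# Sketch only: synthetic data stands in for the course dataset.
# The hyperparameters match the exercise; the scores will not.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import f1_score

X, y = make_classification(n_samples=500, n_features=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Tree "A": constrained via minimum leaf and split sizes
clf_A = DecisionTreeClassifier(min_samples_leaf=3, min_samples_split=9,
                               random_state=42).fit(X_train, y_train)

# Tree "B": deliberately "weak" -- shallow, and limited to 2 features per split
clf_B = DecisionTreeClassifier(max_depth=4, max_features=2,
                               random_state=42).fit(X_train, y_train)

f1_A = f1_score(y_test, clf_A.predict(X_test))
f1_B = f1_score(y_test, clf_B.predict(X_test))
print(f"F1 A: {f1_A:.2f}  F1 B: {f1_B:.2f}")
```

Capping max_depth and max_features is a common way to produce intentionally weak learners, which is exactly what ensemble methods such as boosting rely on.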
This exercise is part of the course Ensemble Methods in Python.