"Weak" decision tree
In the previous exercise you built two decision trees. Which one is fine-tuned and which one is "weak"?
Decision tree "A":
- min_samples_leaf = 3
- min_samples_split = 9
- F1-Score: ~58%
Decision tree "B":
- max_depth = 4
- max_features = 2
- F1-Score: ~53%
Both classifiers are available for you as clf_A and clf_B.
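As a reminder of how such classifiers are typically built and scored, here is a minimal sketch of the comparison, assuming a scikit-learn workflow. The synthetic dataset (`make_classification`) and the train/test split are illustrative assumptions, not the course data, so the F1-scores printed here will differ from the ~58% and ~53% quoted above.

```python
# Hypothetical sketch of how clf_A and clf_B might be defined and
# compared. The dataset below is synthetic, NOT the course data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import f1_score

X, y = make_classification(n_samples=500, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Tree "A": constrained by minimum leaf/split sizes (fine-tuned)
clf_A = DecisionTreeClassifier(
    min_samples_leaf=3, min_samples_split=9, random_state=0)

# Tree "B": shallow depth and few features per split ("weak" learner)
clf_B = DecisionTreeClassifier(
    max_depth=4, max_features=2, random_state=0)

for name, clf in [("A", clf_A), ("B", clf_B)]:
    clf.fit(X_train, y_train)
    score = f1_score(y_test, clf.predict(X_test))
    print(f"Tree {name} F1-score: {score:.2f}")
```

Restricting `max_depth` and `max_features` deliberately limits what a single tree can learn; that weakness is exactly what ensemble methods such as boosting exploit, since many weak learners can be combined into a strong one.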
This exercise is part of the course Ensemble Methods in Python.