
"Weak" decision tree

In the previous exercise, you built two decision trees. Which one is fine-tuned, and which one is "weak"?

Decision tree "A":

  • min_samples_leaf = 3 and min_samples_split = 9
  • F1-Score: ~58%

Decision tree "B":

  • max_depth = 4 and max_features = 2
  • F1-Score: ~53%

Both classifiers are available to you as clf_A and clf_B.
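
For reference, a minimal sketch of how two such trees could be built and compared on F1-Score with scikit-learn. The dataset, the train/test split, and the scoring loop are assumptions for illustration; only the hyperparameters and the names clf_A and clf_B come from the exercise.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import f1_score

# Hypothetical dataset standing in for the exercise's data
X, y = make_classification(n_samples=1000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Decision tree "A": constrained via minimum leaf and split sizes
clf_A = DecisionTreeClassifier(min_samples_leaf=3, min_samples_split=9,
                               random_state=42)

# Decision tree "B": restricted depth and number of features per split
clf_B = DecisionTreeClassifier(max_depth=4, max_features=2, random_state=42)

# Fit both trees and report their F1-Scores on the held-out test set
for name, clf in [("A", clf_A), ("B", clf_B)]:
    clf.fit(X_train, y_train)
    pred = clf.predict(X_test)
    print(f"Decision tree {name}: F1-Score = {f1_score(y_test, pred):.2%}")
```

On your own data the scores will differ from the ~58% and ~53% quoted above; the point of the comparison is that the more heavily restricted tree ("weak" learner) trades individual accuracy for the diversity that ensemble methods exploit.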

This exercise is part of the course

Ensemble Methods in Python
