Machine Learning with Tree-Based Models in R

Exercise

Create the folds

Splitting data only once into training and test sets has statistical drawbacks: there is a small chance that your test set contains only high-rated beans, while all the low-rated beans end up in your training set. It also means that you can only measure the performance of your model once.

Cross-validation gives you a more robust estimate of your out-of-sample performance without these pitfalls: because every observation serves in a test fold exactly once, your model is assessed more thoroughly.

In this exercise, you will create folds of your training data chocolate_train, which is pre-loaded.

Instructions

100 XP
  • Set a seed of 20 for reproducibility.
  • Create 10 folds of chocolate_train and save the result as chocolate_folds.
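A possible solution for the two steps above, sketched with the `vfold_cv()` function from the rsample package (part of tidymodels, which this course family typically uses; that choice is an assumption, as is the stand-in data frame created so the snippet runs outside the exercise environment where `chocolate_train` is pre-loaded):

```r
library(rsample)

# chocolate_train is pre-loaded in the exercise; this stand-in data frame
# is only here so the snippet is self-contained (hypothetical columns).
chocolate_train <- data.frame(rating = runif(100, min = 1, max = 5))

# Set a seed of 20 for reproducibility
set.seed(20)

# Create 10 folds of the training data
chocolate_folds <- vfold_cv(chocolate_train, v = 10)

chocolate_folds
```

The result is a tibble with one row per fold; each row holds a split object pairing that fold's analysis (training) and assessment (test) portions.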