5 x 5-fold cross-validation
You can do more than one iteration of cross-validation: repeating the entire cross-validation procedure gives you a better estimate of the test-set error. It takes longer, but it gives you many more out-of-sample datasets to look at and a much more precise assessment of how well the model performs.
One of the awesome things about the train() function in caret is how easy it is to run very different models or methods of cross-validation just by tweaking a few simple arguments to the function call. For example, you could repeat your entire cross-validation procedure 5 times for greater confidence in your estimates of the model's out-of-sample accuracy, e.g.:
trControl = trainControl(
  method = "repeatedcv",
  number = 5,
  repeats = 5,
  verboseIter = TRUE
)
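To see what this buys you: 5 repeats of 5-fold cross-validation evaluates the model on 5 x 5 = 25 held-out resamples. Below is a minimal sketch of how such a control object plugs into train(), using the built-in mtcars data rather than the exercise's Boston dataset; the formula and object names here are illustrative only.

library(caret)

# 5 repeats of 5-fold CV: the model is refit and scored on 5 x 5 = 25 held-out folds
ctrl <- trainControl(
  method = "repeatedcv",
  number = 5,
  repeats = 5,
  verboseIter = TRUE
)

fit <- train(mpg ~ ., mtcars, method = "lm", trControl = ctrl)

# One row per held-out fold per repeat (25 rows), labelled Fold1.Rep1, Fold2.Rep1, ...
fit$resample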
Exercise instructions
- Re-fit the linear regression model to the Boston housing dataset.
- Use 5 repeats of 5-fold cross-validation.
- Print the model to the console.
Hands-on interactive exercise
Have a go at this exercise by completing this sample code.
# Fit lm model using 5 x 5-fold CV: model
model <- train(
  medv ~ .,
  Boston,
  method = "lm",
  trControl = trainControl(
    method = "repeatedcv",
    number = ___,
    repeats = ___,
    verboseIter = TRUE
  )
)

# Print model to console
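For reference, here is one possible completion of the sample code, a sketch assuming caret is loaded and the Boston housing data is available from the MASS package:

library(caret)
library(MASS)  # provides the Boston housing dataset

# Fit lm model using 5 x 5-fold CV: model
model <- train(
  medv ~ .,
  Boston,
  method = "lm",
  trControl = trainControl(
    method = "repeatedcv",
    number = 5,   # 5 folds per repeat
    repeats = 5,  # repeat the full 5-fold procedure 5 times
    verboseIter = TRUE
  )
)

# Print model to console
model

The printed output reports RMSE and R-squared averaged over the 25 held-out resamples.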