Try a longer tune length
Recall from the video that random forest models have a primary tuning parameter, mtry, which controls how many variables are exposed to the splitting search routine at each split. For example, suppose that a tree has a total of 10 splits and mtry = 2. This means that at each of those 10 splits, 2 predictors are randomly sampled and considered as candidates for the split.
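To see what mtry does in isolation, here is a minimal sketch using the ranger package directly (caret's method = "ranger" wraps it). It assumes a data frame named wine with a numeric quality column is already loaded, and the mtry values shown are arbitrary examples that must not exceed the number of predictors in your data.
# Minimal sketch: mtry sets how many predictors are randomly sampled
# as split candidates at each node (assumes `wine` with numeric `quality`)
library(ranger)
set.seed(42)
fit_small <- ranger(quality ~ ., data = wine, mtry = 2)  # 2 candidates per split
fit_large <- ranger(quality ~ ., data = wine, mtry = 8)  # 8 candidates per split
# Compare out-of-bag prediction error (MSE for a regression forest)
c(mtry_2 = fit_small$prediction.error, mtry_8 = fit_large$prediction.error)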
Use a larger tuning grid this time, but stick to the defaults provided by the train() function. Try a tuneLength of 3, rather than 1, to explore some more potential models, and plot the resulting model using the plot() function.
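As an aside, when you rely on train()'s defaults, the mtry grid for a given tuneLength is typically built from roughly evenly spaced values between 2 and the number of predictors. The snippet below is a sketch of how to preview that grid with caret's var_seq() helper; p = 12 is an assumed predictor count, so adjust it to match your data.
# Sketch: preview the mtry values a default grid of length 3 would try
# for a regression model with (assumed) 12 predictors
library(caret)
var_seq(p = 12, classification = FALSE, len = 3)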
Exercise instructions
- Train a random forest model, model, using the wine dataset with quality as the response variable and all other variables as explanatory variables. (This will take a few seconds to run, so be patient!)
- Use method = "ranger".
- Change the tuneLength to 3.
- Use 5 CV folds.
- Print model to the console.
- Plot the model after fitting it.
Hands-on interactive exercise
Have a go at this exercise by completing this sample code.
# Fit random forest: model
model <- train(
___,
tuneLength = 1,
data = ___,
method = ___,
trControl = trainControl(
method = "cv",
number = ___,
verboseIter = TRUE
)
)
# Print model to console
# Plot model
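For reference, a completed sketch of the exercise might look like the following. This is not the official solution; it assumes the caret and ranger packages are installed and that the wine data frame (with a quality column) is already loaded in your session.
# Completed sketch (assumes `wine` is loaded)
library(caret)
model <- train(
  quality ~ .,
  tuneLength = 3,
  data = wine,
  method = "ranger",
  trControl = trainControl(
    method = "cv",
    number = 5,
    verboseIter = TRUE
  )
)
# Print model to console
model
# Plot model: CV performance across the mtry values that were tried
plot(model)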