Using `tune.svm()`

This exercise gives you hands-on practice with the tune.svm() function. You will use it to obtain optimal values for the cost, gamma, and coef0 parameters of an SVM model, based on the radially separable dataset you created earlier in this chapter. The training data is available in the data frame trainset, the test data in testset, and the e1071 library has been preloaded for you. Remember that the class variable y is stored in the third column of both trainset and testset.

Also recall that in the video, Kailash used cost = 10^(1:3) to search the cost parameter from 10 (= 10^1) to 1000 (= 10^3) in multiples of 10.
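In R, `^` is applied element-wise to the vector produced by `:`, so `10^(1:3)` expands directly into the three candidate costs:

```r
# 1:3 creates the vector c(1, 2, 3); 10^ is applied to each element
costs <- 10^(1:3)
print(costs)  # 10 100 1000
```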

This exercise is part of the course "Support Vector Machines in R".

Exercise instructions

  • Set parameter search ranges as follows:
    • cost - from 0.1 (10^(-1)) to 100 (10^2) in multiples of 10.
    • gamma and coef0 - each taking one of the values 0.1, 1, and 10.
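These ranges imply a grid of 4 × 3 × 3 = 36 parameter combinations, which tune.svm() evaluates by cross-validation. As a quick sanity check on the grid size, you can enumerate the combinations yourself with base R's expand.grid():

```r
# enumerate every (cost, gamma, coef0) combination in the search ranges
grid <- expand.grid(cost  = 10^(-1:2),
                    gamma = c(0.1, 1, 10),
                    coef0 = c(0.1, 1, 10))
nrow(grid)  # 36
```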

Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

# tune model
tune_out <- 
    tune.svm(x = trainset[, -3], y = trainset[, 3], 
             type = "C-classification", 
             kernel = "polynomial", degree = 2, cost = 10^(___:___), 
             gamma = c(___, ___, ___), coef0 = c(0.1, 1, 10))

# list optimal values
tune_out$best.parameters$___
tune_out$best.parameters$___
tune_out$best.parameters$___
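One possible completion is sketched below, assuming the trainset layout described above (two feature columns, with the class label y in column 3). Since this page's trainset is not available here, the block simulates a stand-in radially separable dataset so it runs on its own:

```r
library(e1071)

# Simulated stand-in for the chapter's radially separable trainset:
# two features (x1, x2) and a class label y set by a circular boundary.
set.seed(1)
n <- 200
trainset <- data.frame(x1 = runif(n, -1, 1), x2 = runif(n, -1, 1))
trainset$y <- factor(ifelse(trainset$x1^2 + trainset$x2^2 < 0.5, -1, 1))

# tune model: cost from 10^-1 to 10^2, gamma and coef0 in {0.1, 1, 10}
tune_out <-
    tune.svm(x = trainset[, -3], y = trainset[, 3],
             type = "C-classification",
             kernel = "polynomial", degree = 2, cost = 10^(-1:2),
             gamma = c(0.1, 1, 10), coef0 = c(0.1, 1, 10))

# list optimal values found by cross-validation
tune_out$best.parameters$cost
tune_out$best.parameters$gamma
tune_out$best.parameters$coef0
```

The optimal values reported for the simulated data will generally differ from those for the course's trainset; what matters is the pattern of passing candidate vectors to tune.svm() and reading the winners out of best.parameters.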

This course will introduce the support vector machine (SVM) using an intuitive, visual approach.

This chapter provides an introduction to polynomial kernels via a dataset that is radially separable (i.e., has a circular decision boundary). After demonstrating the inadequacy of linear kernels for this dataset, it shows how a simple transformation renders the problem linearly separable, motivating an intuitive discussion of the kernel trick. Students then apply the polynomial kernel to the dataset and tune the resulting classifier.

  • Exercise 1: Generating a radially separable dataset
  • Exercise 2: Generating a 2d radially separable dataset
  • Exercise 3: Visualizing the dataset
  • Exercise 4: Linear SVMs on radially separable data
  • Exercise 5: Linear SVM for a radially separable dataset
  • Exercise 6: Average accuracy for linear SVM
  • Exercise 7: The kernel trick
  • Exercise 8: Visualizing transformed radially separable data
  • Exercise 9: SVM with polynomial kernel
  • Exercise 10: Tuning SVMs
  • Exercise 11: Using `tune.svm()`
  • Exercise 12: Building and visualizing the tuned model
