
Bigger mistakes, bigger penalty

All prediction errors are undesirable, but not all are equally bad. Large prediction errors are sometimes disproportionately more harmful than small ones.

Bigger mistakes, bigger penalty - that's one of the defining features of the root mean squared error, or RMSE. Because every error is squared before averaging, large errors are punished far more harshly than small ones.
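A minimal sketch in R, using made-up errors, shows the effect: a single error of 10 contributes 100 to the sum of squares, while an error of 1 contributes only 1.

# Made-up prediction errors: four small ones and one large outlier
errors <- c(1, -1, 2, -2, 10)

# Mean absolute error: the outlier counts only 10 times as much as a unit error
mean(abs(errors))     # 3.2

# RMSE: squaring makes the outlier contribute 100, dwarfing the small errors
sqrt(mean(errors^2))  # about 4.69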

RMSE can be calculated using the following formula, where \(\text{squared\_diff}_i\) is the square of the \(i\)-th error:

$$RMSE = \sqrt{\frac{1}{n} \sum_{i=1}^{n} \text{squared\_diff}_i}$$
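As a quick illustration with made-up errors of 1, 2, and 5 (so \(n = 3\)):

$$RMSE = \sqrt{\frac{1^2 + 2^2 + 5^2}{3}} = \sqrt{10} \approx 3.16$$

The single error of 5 contributes 25 of the 30 squared units and dominates the result.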

In this exercise, you will compute the RMSE of your predictions.

Available in your workspace is test_enriched, the result of the last exercise: the test data with a new column .pred containing the model's out-of-sample predictions.

This exercise is part of the course

Machine Learning with Tree-Based Models in R


Exercise instructions

  • Calculate the component-wise differences of the predictions and the final grades, square them, and save as squared_diffs.
  • Use the formula above to calculate the RMSE and save it as rmse_manual.
  • Use the rmse() function to calculate the error and save as rmse_auto.
  • Print rmse_manual and rmse_auto to verify that they are the same.

Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

# Calculate the squared differences
squared_diffs <- (___ - ___)^___

# Compute the RMSE using the formula
rmse_manual <- ___(1 / ___ * ___)

# Compute the RMSE using a function
rmse_auto <- ___(___,
                 ___,
                 ___)

# Print both errors
___
___
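For reference, here is one possible completion of the scaffold. It is a sketch, not the official solution: it assumes the true grades in test_enriched are stored in a column named final_grade, and that rmse() is the yardstick function rmse(data, truth, estimate).

library(yardstick)  # assumed source of rmse(); likely preloaded in the workspace

# Calculate the squared differences (assumes a final_grade column)
squared_diffs <- (test_enriched$.pred - test_enriched$final_grade)^2

# Compute the RMSE using the formula
rmse_manual <- sqrt(1 / length(squared_diffs) * sum(squared_diffs))

# Compute the RMSE using a function
rmse_auto <- rmse(test_enriched,
                  truth = final_grade,
                  estimate = .pred)

# Print both errors
rmse_manual
rmse_auto

If rmse() is indeed yardstick's, rmse_manual prints a single number while rmse_auto prints a one-row tibble; the value in its .estimate column should agree with rmse_manual.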