Wrap-up

1. Recap: Anomaly Detection in R

You've now completed Anomaly Detection in R. Well done! Let's review the course highlights.

2. Course summary

In chapter one, we explored graphical summaries and univariate tests for finding outliers, including Grubbs' test and the Seasonal-Hybrid ESD algorithm. Chapter two introduced kNN and LOF, two algorithms that construct anomaly scores from how far away or how densely packed a point's nearest neighbors are. In chapter three, you practiced the isolation forest algorithm, a fast tree-based approach that scores anomalies by how easily points can be separated using random splits. Finally, in chapter four you learned how to use precision and recall to compare the performance of anomaly detection algorithms when anomaly labels are available, and how to cope with categorical features.
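
To tie the scoring methods together, here is a minimal sketch in R of kNN-distance, LOF, and isolation forest scores on a small simulated dataset. The package choices (FNN, dbscan, isotree), the simulated data, and the parameter values are assumptions for illustration, not necessarily what the course exercises used.

```r
# A minimal sketch of the score-based approaches from chapters 2-3 on
# simulated data; packages and parameter values are assumptions.
library(FNN)      # kNN distances
library(dbscan)   # local outlier factor (LOF)
library(isotree)  # isolation forest

set.seed(42)
normal    <- data.frame(x = rnorm(200), y = rnorm(200))
anomalies <- data.frame(x = rnorm(5, mean = 4), y = rnorm(5, mean = 4))
dat       <- rbind(normal, anomalies)

# kNN anomaly score: mean distance to the k nearest neighbors
k         <- 5
knn_out   <- get.knn(dat, k = k)
knn_score <- rowMeans(knn_out$nn.dist)

# LOF score: how much less densely packed a point is than its neighbors
lof_score <- lof(dat, minPts = k)

# Isolation forest score: points isolated by few random splits score high
iso_model <- isolation.forest(dat, ntrees = 100)
iso_score <- predict(iso_model, dat)

# The five injected points (rows 201-205) should rank among the highest scores
head(order(iso_score, decreasing = TRUE), 5)
```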

3. What's next?

You've built a great foundation by completing the course. So what's next? Each of the parameters, like $k$ in kNN, influences the anomaly score, so a natural next step is learning how to choose good values for these parameters. The course covered some popular techniques, but there are many others to explore, including the one-class support vector machine and clustering approaches.
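
As one way to think about choosing a parameter, here is a small sketch that compares the kNN anomaly score across several values of $k$, using simulated labelled data and precision among the top-ranked points as the yardstick. The data, the candidate $k$ values, and the "flag the five highest scores" rule are all assumptions for illustration.

```r
# Sketch: compare kNN anomaly scores for several k values using precision
# among the top 5 flagged points; data and k values are illustrative.
library(FNN)

set.seed(123)
normal     <- data.frame(x = rnorm(200), y = rnorm(200))
anomalies  <- data.frame(x = rnorm(5, mean = 4), y = rnorm(5, mean = 4))
dat        <- rbind(normal, anomalies)
is_anomaly <- c(rep(FALSE, 200), rep(TRUE, 5))

precision_at_5 <- sapply(c(3, 5, 10, 20), function(k) {
  score   <- rowMeans(get.knn(dat, k = k)$nn.dist)  # kNN anomaly score
  flagged <- rank(-score) <= 5                      # flag the 5 highest scores
  mean(is_anomaly[flagged])                         # precision among flagged points
})
precision_at_5
```

When labels are available, this kind of comparison extends naturally to the precision and recall calculations from chapter four.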

4. Congratulations!

Congratulations on completing the course!