Congratulations!

1. Congratulations!

Congratulations! You've made it all the way through this dimensionality reduction deep dive.

2. What you've learned

You now know why dimensionality reduction is important and when to use it. You also understand the difference between feature selection and feature extraction, and you've added multiple options to your toolkit to apply both. When you need to explore a new high-dimensional dataset, you can use techniques like t-SNE and PCA to find the strongest patterns in the data. When you want to use a dataset to predict some target feature, you now know how to let models help you find the most important features for that task, and you've got multiple tricks up your sleeve to remove the unimportant ones.

While you now have a solid foundation in dimensionality reduction, there is always more to learn on this topic. For instance, we've been focusing on numerical data in this course, and this has come at the cost of giving categorical features and text data a bit of a cold treatment. I encourage you to have a look at the specialized techniques for those data types.
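To jog your memory one last time, here is a minimal sketch of the two families side by side: feature extraction with PCA, and model-driven feature selection using a random forest's importance scores. The synthetic dataset and model choices below are illustrative only, not taken from the course exercises; t-SNE would slot in the same way as PCA for exploration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier

# Illustrative synthetic data: 100 samples, 5 features,
# where only the first two features drive the target.
rng = np.random.default_rng(seed=0)
X = rng.normal(size=(100, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Feature extraction: project onto the 2 strongest directions of variance.
pca = PCA(n_components=2)
X_2d = pca.fit_transform(X)
print(X_2d.shape)  # (100, 2)

# Feature selection: let a model rank features, then keep the strongest.
rf = RandomForestClassifier(random_state=0).fit(X, y)
importances = rf.feature_importances_
top_features = np.argsort(importances)[::-1][:2]
print(sorted(top_features))
```

Exploration (extraction) and prediction (selection) often complement each other: a 2D PCA or t-SNE plot tells you whether structure exists, and model importances tell you which original features carry it.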

3. Thank you!

But for now, you deserve a break and my compliments for what you've achieved. I thank you for staying with me throughout this course and hope you've enjoyed it. Goodbye!