
1. Final Thoughts

Congratulations on completing this course. Let's review everything we've covered, as well as where you can go from here with related XGBoost topics we didn't have a chance to get to.

2. What We Have Covered And You Have Learned

So, what have we been able to cover in this course? Well, we've learned how to use XGBoost for both classification and regression tasks. We've also covered the most important hyperparameters to tune when creating XGBoost models, so that they are as performant as possible. And we just finished covering how to incorporate XGBoost into pipelines, using some more advanced functions that let us work seamlessly with pandas DataFrames and scikit-learn. That's quite a lot of ground, and you should be proud of what you've been able to accomplish.

3. What We Have Not Covered (And How You Can Proceed)

However, although we've covered quite a lot, there are other topics that would advance your mastery of XGBoost. Specifically, we never looked into how to use XGBoost for ranking or recommendation problems, which you can do by changing the objective (loss) function when constructing your model. We also didn't look into more advanced hyperparameter selection strategies. The most powerful of these, Bayesian optimization, has been used with great success, and entire companies have been built around applying this method to model tuning (the company SigOpt, for example, does exactly this). It's a powerful method, but it would take another entire DataCamp course to teach properly! Finally, we haven't talked about ensembling XGBoost with other models. Although XGBoost is itself an ensemble method, nothing stops you from combining its predictions with those of other models, and this is usually a powerful way to squeeze the last bit of performance from your data. Learning about these additional topics will make you an even more effective user of XGBoost. Now that you know your way around the package, there's no reason to stop learning how to get even more out of it.

4. Congratulations!

I hope you've enjoyed taking this course on XGBoost as much as I have enjoyed teaching it. Please let us know if you've enjoyed the course, and definitely let me know how I can improve it. It's been a pleasure, and I hope you continue your data science journey from here!
