
XGBoost

In this exercise, you'll practice yet another boosting technique. Dubbed the new queen of Machine Learning, XGBoost is an optimized, distributed gradient boosting package that is "taking over the world!" You are therefore likely to be asked about it in a Machine Learning interview, and at the very least it would benefit you to discuss it in one of your answers to demonstrate your knowledge of cutting-edge, highly accurate algorithms.

The argument learning_rate=0.1 specifies the step size taken at each boosting iteration while minimizing the loss function, and max_depth controls the size (depth) of the individual decision trees, here 3.
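
Concretely, gradient boosting adds each new tree h_m to the running ensemble with its contribution shrunk by the learning rate, F_m(x) = F_{m-1}(x) + 0.1 * h_m(x), so a smaller step size makes training more conservative and usually calls for more trees.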

All relevant packages have been imported for you: pandas as pd, train_test_split from sklearn.model_selection, accuracy_score from sklearn.metrics, LogisticRegression from sklearn.linear_model, BaggingClassifier and AdaBoostClassifier from sklearn.ensemble, and XGBClassifier from xgboost.

The loan_data DataFrame is already split into X_train, X_test, y_train and y_test.
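
For reference, a minimal sketch of this setup follows; the target column name ("Default") and the split proportions are assumptions for illustration, since the data is already loaded and split in the exercise.

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import BaggingClassifier, AdaBoostClassifier
from xgboost import XGBClassifier

# Hypothetical split of the preloaded loan_data DataFrame;
# the "Default" target column name is an assumption
X = loan_data.drop("Default", axis=1)
y = loan_data["Default"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=123)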

This exercise is part of the course

Practicing Machine Learning Interview Questions in Python


Hands-on interactive exercise

Try this exercise by completing this sample code.

# Instantiate the XGBoost classifier
# (n_estimators=100 is illustrative; the exercise leaves this blank)
xgb = XGBClassifier(n_estimators=100, random_state=123, learning_rate=0.1, max_depth=3)
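
Once instantiated, a typical completion fits the classifier and evaluates its accuracy on the held-out data; the following is a sketch, not the graded solution:

# Fit to the training data, predict on the test set, and score
xgb.fit(X_train, y_train)
y_pred = xgb.predict(X_test)
print(accuracy_score(y_test, y_pred))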