Exercise

Boosting contest: Light vs Extreme

While the performance of the CatBoost model is relatively good, let's try two other flavors of boosting and see which performs better: the "Light" or the "Extreme" approach.

CatBoost is highly recommended when there are categorical features. Since all the features in this case are numeric, one of the other approaches might produce better results.

As we are building regressors, we'll use an additional parameter, objective, which specifies the learning objective, i.e. the loss function to optimize. To use a squared-error loss, we'll set objective to 'reg:squarederror' for XGBoost and to 'mean_squared_error' for LightGBM.

In addition, we'll specify the n_jobs parameter for XGBoost, which sets the number of parallel threads and thereby shortens training time.
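
As a minimal sketch, this is how those two parameters are passed to each library's scikit-learn-style constructor:

    from xgboost import XGBRegressor
    from lightgbm import LGBMRegressor

    # Each library names the squared-error objective differently
    xgb_reg = XGBRegressor(objective='reg:squarederror', n_jobs=2)
    lgbm_reg = LGBMRegressor(objective='mean_squared_error')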

Instructions

100 XP
  • Build an XGBRegressor using the parameters: max_depth = 3, learning_rate = 0.1, n_estimators = 100, and n_jobs = 2.
  • Build an LGBMRegressor using the parameters: max_depth = 3, learning_rate = 0.1, and n_estimators = 100.
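
Putting the instructions together, a sketch of both builds might look as follows; X_train, y_train, X_test, and y_test are assumed names for the data splits, which this exercise does not spell out:

    from xgboost import XGBRegressor
    from lightgbm import LGBMRegressor
    from sklearn.metrics import mean_squared_error

    # "Extreme" gradient boosting regressor
    xgb_reg = XGBRegressor(
        objective='reg:squarederror',  # squared-error loss
        max_depth=3,
        learning_rate=0.1,
        n_estimators=100,
        n_jobs=2,                      # two parallel threads
    )

    # "Light" gradient boosting regressor
    lgbm_reg = LGBMRegressor(
        objective='mean_squared_error',  # squared-error loss
        max_depth=3,
        learning_rate=0.1,
        n_estimators=100,
    )

    # X_train / y_train / X_test / y_test are assumed to be predefined
    xgb_reg.fit(X_train, y_train)
    lgbm_reg.fit(X_train, y_train)

    print('XGBoost  MSE:', mean_squared_error(y_test, xgb_reg.predict(X_test)))
    print('LightGBM MSE:', mean_squared_error(y_test, lgbm_reg.predict(X_test)))

The lower test MSE indicates which flavor of boosting wins the contest on this dataset.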