Exercise

Boosting contest: Light vs Extreme

While the performance of the CatBoost model is relatively good, let's try the two other flavors of boosting and see which one performs better: the "Light" or the "Extreme" approach.

CatBoost is highly recommended when there are categorical features. In this case, all features are numeric, therefore one of the other approaches might perform better.

As we are building a regressor, we'll use an additional parameter, objective, which specifies the loss function to optimize during training. To apply a squared error, we'll set objective to 'reg:squarederror' for XGBoost and 'mean_squared_error' for LightGBM.

Instructions
100 XP
  • Build an Extreme regression model, using the parameters: max_depth = 3, learning_rate = 0.1, and n_estimators = 100.
  • Build a Light regression model, with the same parameters as before.