Random search with XGBoost
Often, GridSearchCV can be very time-consuming, so in practice you may want to use RandomizedSearchCV instead, as you will do in this exercise. The good news is that you only have to make a few modifications to your GridSearchCV code to use RandomizedSearchCV. The key difference is that you specify a param_distributions parameter instead of a param_grid parameter.
This exercise is part of the course
Extreme Gradient Boosting with XGBoost
Exercise instructions
- Create a parameter grid called `gbm_param_grid` that contains a list with a single value for `'n_estimators'` (25), and a list of `'max_depth'` values between 2 and 11 for `'max_depth'`. Use `range(2, 12)` for this.
- Create a `RandomizedSearchCV` object called `randomized_mse`, passing in: the parameter grid to `param_distributions`, the `XGBRegressor` to `estimator`, `"neg_mean_squared_error"` to `scoring`, `5` to `n_iter`, and `4` to `cv`. Also specify `verbose=1` so you can better understand the output.
- Fit the `RandomizedSearchCV` object to `X` and `y`.
Hands-on interactive exercise
Try this exercise and complete the sample code.
import numpy as np
import xgboost as xgb
from sklearn.model_selection import RandomizedSearchCV

# X and y are assumed to be preloaded, as in the exercise environment.

# Create the parameter grid: gbm_param_grid
gbm_param_grid = {
    'n_estimators': [25],
    'max_depth': range(2, 12)
}
# Instantiate the regressor: gbm
gbm = xgb.XGBRegressor(n_estimators=10)
# Perform random search: randomized_mse
randomized_mse = RandomizedSearchCV(estimator=gbm,
                                    param_distributions=gbm_param_grid,
                                    scoring="neg_mean_squared_error",
                                    n_iter=5, cv=4, verbose=1)
# Fit randomized_mse to the data
randomized_mse.fit(X, y)
# Print the best parameters and lowest RMSE
print("Best parameters found: ", randomized_mse.best_params_)
print("Lowest RMSE found: ", np.sqrt(np.abs(randomized_mse.best_score_)))