Exercise

# Joblib

In the last exercise of this course, we will use the grid search technique to find the optimal hyperparameters for an elastic net model.

Grid search is computationally intensive. To speed up the search, we will use the `joblib` `parallel_backend()` function.

The scikit-learn `GridSearchCV` class has already been instantiated as `engrid` with a grid of two hyperparameters:

- `l1_ratio`: the mix of Lasso (L1) and Ridge (L2) regression penalties used to shrink model coefficients
- `alpha`: the severity of the penalty

Applying penalties to model coefficients helps avoid overfitting and produces models that perform better on new data.
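Although `engrid` is provided ready-made, a rough sketch of how such an object could be instantiated may help; the grid values and `cv` setting below are assumptions for illustration, not the course's actual configuration.

```python
import numpy as np
from sklearn.linear_model import ElasticNet
from sklearn.model_selection import GridSearchCV

# Hypothetical grid: l1_ratio mixes L1 and L2 penalties
# (0 = pure Ridge, 1 = pure Lasso); alpha sets penalty severity.
param_grid = {
    "l1_ratio": np.linspace(0.1, 1.0, 10),
    "alpha": np.logspace(-3, 1, 5),
}

# Search every combination with cross-validation.
engrid = GridSearchCV(ElasticNet(max_iter=10_000), param_grid, cv=5)
```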

We will use the optimal `l1_ratio` to create an `enet_path()` plot that shows how coefficients shrink as `alpha` increases.
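As a minimal sketch of that plot, the snippet below computes and draws coefficient paths on synthetic data; the dataset and `l1_ratio=0.5` are placeholders for the exercise's data and the optimal ratio the grid search will find.

```python
import matplotlib.pyplot as plt
from sklearn.datasets import make_regression
from sklearn.linear_model import enet_path

# Placeholder data standing in for the exercise's dataset.
X, y = make_regression(n_samples=100, n_features=10, noise=5.0, random_state=0)

# Coefficient paths for a fixed l1_ratio; alphas are returned
# in decreasing order, coefs has shape (n_features, n_alphas).
alphas, coefs, _ = enet_path(X, y, l1_ratio=0.5, n_alphas=50)

# One line per coefficient: each shrinks toward zero as alpha grows.
for coef in coefs:
    plt.semilogx(alphas, coef)
plt.xlabel("alpha")
plt.ylabel("coefficient value")
plt.title("Elastic net coefficient paths")
```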

Instructions

**100 XP**

- Set up a Dask client with one worker and four threads.
- Run the grid search using `joblib` and a Dask parallel backend.
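The two steps above might be sketched as follows; since `engrid` and the exercise's data are not shown here, the grid and the synthetic dataset below are stand-in assumptions.

```python
import joblib
from dask.distributed import Client
from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNet
from sklearn.model_selection import GridSearchCV

# Step 1: a Dask client with one worker and four threads.
client = Client(n_workers=1, threads_per_worker=4)

# Stand-in for the pre-instantiated `engrid` (grid values assumed).
engrid = GridSearchCV(
    ElasticNet(max_iter=10_000),
    {"l1_ratio": [0.1, 0.5, 0.9], "alpha": [0.01, 0.1, 1.0]},
    cv=5,
)

# Stand-in for the exercise's dataset.
X, y = make_regression(n_samples=200, n_features=10, noise=5.0, random_state=0)

# Step 2: route joblib's parallelism through the Dask cluster.
with joblib.parallel_backend("dask"):
    engrid.fit(X, y)

print(engrid.best_params_)
client.close()
```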