Exercise

Exploding gradient problem

In the video exercise, you learned about two problems that may arise when working with RNN models: the vanishing and exploding gradient problems.

This exercise explores the exploding gradient problem, showing how gradients can grow exponentially during training, and how to solve the problem with a simple technique: gradient clipping.

The data is already loaded in the environment as X_train, X_test, y_train and y_test.

You will use a Stochastic Gradient Descent (SGD) optimizer and Mean Squared Error (MSE) as the loss function.

In the first step, you will observe the gradients explode by computing the MSE on the training and test sets. In the second step, you will change the optimizer, using its clipvalue parameter to solve the problem.

The Stochastic Gradient Descent optimizer in Keras is loaded as SGD.
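
Since the fix is a one-line change to the optimizer, here is a minimal sketch of the two configurations, assuming a TensorFlow Keras import; the clip value of 3.0 is an illustrative assumption, not the exercise's exact setting.

```python
from tensorflow.keras.optimizers import SGD

# Plain SGD: on a task like this, the gradients can grow without
# bound from one update to the next (step 1).
unclipped = SGD()

# SGD with gradient clipping: every gradient component is capped to
# the range [-3.0, 3.0] before the weight update, which prevents the
# explosion (step 2). The value 3.0 is an illustrative choice.
clipped = SGD(clipvalue=3.0)
```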

Instructions 1/2
  • Use SGD() as the optimizer and (X_test, y_test) as the validation data.
  • Evaluate the training performance and print all of the MSE values.
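
As a rough sketch of what this first step could look like, assuming model is the pre-built Keras RNN from the environment; the epochs value is a hypothetical choice, not the exercise solution.

```python
from tensorflow.keras.optimizers import SGD

# model, X_train, X_test, y_train and y_test are assumed to be
# pre-loaded in the exercise environment.
model.compile(optimizer=SGD(), loss='mean_squared_error')

# Track test performance during training by passing validation data.
history = model.fit(X_train, y_train,
                    validation_data=(X_test, y_test),
                    epochs=10, verbose=0)  # epochs=10 is an assumption

# Evaluate and print the training MSE; with unclipped SGD these
# values can blow up to huge numbers or become NaN.
train_mse = model.evaluate(X_train, y_train, verbose=0)
print('Train MSE:', train_mse)
print('Validation MSE per epoch:', history.history['val_loss'])
```

If the gradients explode, the printed MSE values grow rapidly instead of decreasing; swapping in SGD(clipvalue=...) in step 2 should make them shrink steadily.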