Exercise

Simple network using Keras

By now you have an intuitive understanding of how gradient values become smaller and smaller as we back-propagate through a deep network. In this exercise, you'll work through an example that demonstrates this vanishing gradient problem. You'll create a simple network of Dense layers using Keras and check out the gradient values of the weights for one iteration of back-propagation.

The Sequential model and the Dense and Activation layers are already imported from Keras. The Keras backend module is also imported; it provides a .gradients() method that can be used to get the gradient values of the weights.
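
As a rough illustration (not the exercise's solution code), the sketch below shows how backend.gradients() is typically invoked. It assumes the backend is imported as K and that a model named model has already been built and compiled:

```python
from keras import backend as K

# Symbolic gradient tensors of the compiled model's loss with respect
# to its trainable weights (assumes `model` has already been compiled).
gradients = K.gradients(model.total_loss, model.trainable_weights)
```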

Instructions 1/2
  • Create a Sequential model.
  • Add a Dense layer of 12 units with 'relu' activation, 'uniform' initialization, and an input dimension of 8 to the model.
  • Add a Dense layer of 8 units with 'relu' activation and 'uniform' initialization to the model (a sketch of these steps follows below).
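
The model described by these instructions could be built along the following lines; this is a minimal sketch, assuming the standard Keras imports named above:

```python
from keras.models import Sequential
from keras.layers import Dense

# Build the network described in the instructions:
# two hidden Dense layers with 'relu' activations and 'uniform'
# weight initialization, taking 8-dimensional input vectors.
model = Sequential()
model.add(Dense(12, input_dim=8, kernel_initializer='uniform', activation='relu'))
model.add(Dense(8, kernel_initializer='uniform', activation='relu'))
```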