Exercise

Batch normalizing a familiar model

Remember the digits dataset you trained in the first exercise of this chapter?

It was a multi-class classification problem that you solved using softmax and 10 neurons in your output layer.

You will now build a new, deeper model consisting of 3 hidden layers of 50 neurons each, using batch normalization in between layers. The kernel_initializer parameter is used so that every layer's weights are initialized in the same way.

Instructions

100 XP
  • Import BatchNormalization from tensorflow.keras.layers.
  • Build your deep network model, using 50 neurons for each hidden layer and adding batch normalization in between layers (see the sketch after this list).
  • Compile your model with stochastic gradient descent, sgd, as the optimizer.
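A minimal sketch of one way these steps could fit together is shown below. It assumes the digits inputs have 64 features (8x8 pixel images), that the hidden layers use ReLU activations, that the 'normal' kernel initializer is the one intended, and that the loss is categorical cross-entropy; adjust these to match your earlier exercise.

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, BatchNormalization

model = Sequential()

# First hidden layer: 50 neurons; input_shape=(64,) is an assumption for the digits data
model.add(Dense(50, input_shape=(64,), activation='relu', kernel_initializer='normal'))
model.add(BatchNormalization())

# Second hidden layer with batch normalization after it
model.add(Dense(50, activation='relu', kernel_initializer='normal'))
model.add(BatchNormalization())

# Third hidden layer with batch normalization after it
model.add(Dense(50, activation='relu', kernel_initializer='normal'))
model.add(BatchNormalization())

# Output layer: 10 neurons with softmax for the 10 digit classes
model.add(Dense(10, activation='softmax', kernel_initializer='normal'))

# Compile with stochastic gradient descent as the optimizer
model.compile(optimizer='sgd', loss='categorical_crossentropy', metrics=['accuracy'])
```

Placing BatchNormalization after each Dense layer normalizes that layer's outputs before they reach the next layer, which tends to stabilize and speed up training of deeper networks.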