Changing optimization parameters
It's time to get your hands dirty with optimization. You'll now try optimizing a model at a very low learning rate, a very high learning rate, and a "just right" learning rate. You'll want to look at the results after running this exercise, remembering that a low value for the loss function is good.
For these exercises, we've pre-loaded the predictors and target values from your previous classification models (predicting who would survive on the Titanic). You'll want the optimization to start from scratch every time you change the learning rate, so that each learning rate gets a fair comparison. For that reason, we have created a function get_new_model() that builds a fresh, unoptimized model for you to optimize.
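The exercise environment defines get_new_model() for you, so you don't need to write it yourself. For reference, a minimal sketch of what such a helper might look like is below; the layer sizes and input shape here are assumptions, not the course's actual architecture.

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Input

def get_new_model(input_shape=(10,)):
    """Return a fresh, uncompiled model so each learning rate
    starts optimization from scratch (sketch of the course helper;
    the real architecture may differ)."""
    model = Sequential()
    model.add(Input(shape=input_shape))
    model.add(Dense(32, activation='relu'))
    model.add(Dense(2, activation='softmax'))  # 2 classes: survived / did not
    return model
```

Because the model is rebuilt inside the loop, weights learned with one learning rate never leak into the next trial.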
This exercise is part of the course
Introduction to Deep Learning in Python
Exercise instructions
- Import SGD from tensorflow.keras.optimizers.
- Create a list of learning rates to try optimizing with, called lr_to_test. The learning rates in it should be 0.000001, 0.01, and 1.
- Use a for loop to iterate over lr_to_test:
  - Use the get_new_model() function to build a new, unoptimized model.
  - Create an optimizer called my_optimizer using the SGD() constructor with keyword argument lr=lr.
  - Compile your model. Set the optimizer parameter to be the SGD object you created above, and because this is a classification problem, use 'categorical_crossentropy' for the loss parameter.
  - Fit your model using the predictors and target.
Hands-on interactive exercise
Have a go at this exercise by completing this sample code.
# Import the SGD optimizer
____
# Create list of learning rates: lr_to_test
lr_to_test = ____
# Loop over learning rates
for lr in lr_to_test:
    print('\n\nTesting model with learning rate: %f\n' % lr)

    # Build new model to test, unaffected by previous models
    model = ____

    # Create SGD optimizer with specified learning rate: my_optimizer
    my_optimizer = ____

    # Compile the model
    ____

    # Fit the model
    ____
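Once the blanks are filled in, a complete run might look like the sketch below. The course pre-loads predictors, target, and get_new_model() for you, so this standalone version substitutes synthetic stand-ins for all three; note that recent TensorFlow releases spell the keyword learning_rate rather than the older lr used in the instructions.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Input
from tensorflow.keras.optimizers import SGD
from tensorflow.keras.utils import to_categorical

# Synthetic stand-ins for the pre-loaded Titanic `predictors` and `target`
rng = np.random.default_rng(42)
predictors = rng.normal(size=(100, 10)).astype('float32')
target = to_categorical(rng.integers(0, 2, size=100), num_classes=2)

def get_new_model():
    # Stand-in for the course's pre-defined helper: returns a fresh,
    # uncompiled model so each learning rate starts from scratch
    model = Sequential()
    model.add(Input(shape=(predictors.shape[1],)))
    model.add(Dense(32, activation='relu'))
    model.add(Dense(2, activation='softmax'))
    return model

# Create list of learning rates: lr_to_test
lr_to_test = [0.000001, 0.01, 1]

# Loop over learning rates
for lr in lr_to_test:
    print('\n\nTesting model with learning rate: %f\n' % lr)

    # Build new model to test, unaffected by previous models
    model = get_new_model()

    # Create SGD optimizer with specified learning rate: my_optimizer
    # (older Keras versions use the keyword `lr` instead)
    my_optimizer = SGD(learning_rate=lr)

    # Compile the model
    model.compile(optimizer=my_optimizer, loss='categorical_crossentropy')

    # Fit the model
    model.fit(predictors, target, epochs=5, verbose=0)
```

With the tiny learning rate the loss barely moves, with the huge one it tends to diverge or oscillate, and the middle value typically drives the loss down steadily; the printed training losses make the comparison concrete.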