
Comparing activation functions

Comparing activation functions involves a bit of coding, but nothing you can't do!

You will try out different activation functions on the multi-label model you built for your farm irrigation machine in chapter 2. The function get_model('relu') returns a copy of this model and applies the 'relu' activation function to its hidden layer.
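The exercise environment provides get_model() for you. Purely for context, here is a minimal sketch of what such a helper might look like, assuming the 20-sensor, 3-parcel irrigation model from chapter 2; the layer sizes, optimizer, and the 'leaky_relu' handling are illustrative assumptions, not the course's exact code:

import tensorflow as tf

def get_model(act_function):
  # 'leaky_relu' is not a valid activation string in every Keras
  # version, so map it to the TF function (assumed handling)
  if act_function == 'leaky_relu':
    act_function = tf.nn.leaky_relu
  model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),  # 20 sensor inputs (assumed, per chapter 2)
    tf.keras.layers.Dense(64, activation=act_function),  # hidden layer
    tf.keras.layers.Dense(3, activation='sigmoid')  # one sigmoid per parcel
  ])
  model.compile(optimizer='adam', loss='binary_crossentropy',
                metrics=['accuracy'])
  return model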

You will loop through several activation functions, generating a new model for each and training it. By storing each model's history callback in a dictionary, you will be able to visualize which activation function performed best in the next exercise!

X_train, y_train, X_test, y_test are ready for you to use when training your models.

This exercise is part of the course

Introduction to Deep Learning with Keras


Instructions

  • Fill in the activations list with 'relu', 'leaky_relu', 'sigmoid', and 'tanh'.
  • Get a new model for each iteration by calling get_model(), passing the current activation function as a parameter.
  • Fit your model, providing the training data and validation_data; use 20 epochs and set verbose to 0.

Hands-on interactive exercise

Try this exercise by completing the sample code.

# Activation functions to try
activations = [____, ____, ____, ____]

# Dictionary to store each model's training history
activation_results = {}

# Loop over the activation functions
for act in activations:
  # Get a new model with the current activation
  model = ____
  # Fit the model and store the history results
  h_callback = ____
  activation_results[act] = h_callback
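For reference, the completed loop might look like the following. It is one possible solution, assuming the get_model() helper sketched above and using the test split as validation data, as the instructions suggest:

# Activation functions to try
activations = ['relu', 'leaky_relu', 'sigmoid', 'tanh']

# Dictionary to store each model's training history
activation_results = {}

# Loop over the activation functions
for act in activations:
  # Get a new model with the current activation
  model = get_model(act)
  # Fit the model and store the history results
  h_callback = model.fit(X_train, y_train,
                         validation_data=(X_test, y_test),
                         epochs=20, verbose=0)
  activation_results[act] = h_callback

Each stored value is a Keras History object whose .history attribute maps metric names such as 'loss' and 'val_loss' to per-epoch lists. As a quick preview of the comparison the next exercise builds, one way to plot the validation loss curves (assuming pandas and matplotlib are available) is:

import pandas as pd

# One column of validation loss per activation function
val_loss = pd.DataFrame({act: h.history['val_loss']
                         for act, h in activation_results.items()})
val_loss.plot(title='Validation loss per activation function')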