Activations: ReLU vs. ELU
The choice of activation function used in the model (combined with the corresponding weight initialization) can have a strong impact on the training process. In particular, a well-chosen activation can prevent the network from experiencing unstable gradient problems.
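For example, He (Kaiming) initialization is derived for ReLU-family activations. A minimal sketch of pairing the two in PyTorch (the layer sizes here are arbitrary):

```python
import torch.nn as nn

layer = nn.Linear(256, 128)  # arbitrary example sizes

# Kaiming initialization assumes a ReLU-like activation follows the layer;
# matching the init to the activation keeps gradient variance roughly stable.
nn.init.kaiming_uniform_(layer.weight, nonlinearity="relu")
nn.init.zeros_(layer.bias)
```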
In the previous exercise, you switched from ReLU to ELU activations. Do you remember which characteristics of the two activations justify this change?
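To see the difference concretely, you can compare the two activations on negative inputs. A minimal sketch using torch.nn:

```python
import torch
import torch.nn as nn

relu = nn.ReLU()
elu = nn.ELU()  # alpha defaults to 1.0

x = torch.tensor([-3.0, -1.0, 0.0, 1.0, 3.0])

# ReLU clips all negative inputs to zero, so those units receive zero
# gradient (the "dying ReLU" problem).
print(relu(x))  # tensor([0., 0., 0., 1., 3.])

# ELU stays smooth and nonzero for negative inputs, which keeps gradients
# flowing and pushes mean activations closer to zero.
print(elu(x))   # tensor([-0.9502, -0.6321, 0.0000, 1.0000, 3.0000])
```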
This exercise is part of the course Intermediate Deep Learning with PyTorch.
