
Activations: ReLU vs. ELU

The choice of activation function used in the model (combined with the corresponding weight initialization) can have a strong impact on the training process. In particular, a well-chosen activation can prevent the network from running into unstable-gradient problems such as vanishing or exploding gradients.
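
As a minimal sketch of what pairing an activation with a matching initialization looks like in PyTorch (the layer sizes and model structure here are illustrative assumptions, not the exercise's actual network):

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(8, 16)
        self.fc2 = nn.Linear(16, 1)
        # Kaiming (He) initialization is designed for ReLU-family
        # activations and helps keep gradient magnitudes stable.
        nn.init.kaiming_uniform_(self.fc1.weight, nonlinearity="relu")
        nn.init.kaiming_uniform_(self.fc2.weight, nonlinearity="relu")

    def forward(self, x):
        # ELU is applied between the linear layers.
        x = nn.functional.elu(self.fc1(x))
        return self.fc2(x)
```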

In the previous exercise, you switched from ReLU to ELU activations. Do you remember which characteristics of the two activations justify this change? A quick comparison is sketched below.
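
The snippet below (an illustrative sketch, not part of the exercise code) shows how the two activations treat negative inputs, which is the key difference between them:

```python
import torch
import torch.nn as nn

x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])

relu = nn.ReLU()
elu = nn.ELU(alpha=1.0)

print(relu(x))  # tensor([0.0000, 0.0000, 0.0000, 0.5000, 2.0000])
print(elu(x))   # tensor([-0.8647, -0.3935, 0.0000, 0.5000, 2.0000])

# ReLU zeroes out all negative inputs, so its gradient there is zero and
# neurons can "die" (stop updating). ELU instead returns alpha*(exp(x) - 1)
# for negative inputs, keeping gradients nonzero and mean activations
# closer to zero, which tends to make training more stable.
```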

This exercise is part of the course Intermediate Deep Learning with PyTorch.