
Exercise

Implementing ReLU

The rectified linear unit (or ReLU) function is one of the most common activation functions in deep learning.

It overcomes training problems associated with the sigmoid function you learned about earlier, such as the vanishing gradients problem.
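To see why this matters, here is a minimal illustration (not part of the exercise): the sigmoid's gradient shrinks toward zero as inputs grow in magnitude, while ReLU's gradient stays at 1 for any positive input. The input values are chosen arbitrarily for demonstration.

```python
import torch
import torch.nn as nn

# Sigmoid: gradients vanish for large-magnitude inputs
x = torch.tensor([0.0, 5.0, 10.0], requires_grad=True)
sigmoid = nn.Sigmoid()
sigmoid(x).sum().backward()
print(x.grad)  # tensor([0.2500, 0.0066, 0.0000]) -- shrinking toward zero

# ReLU: gradient is exactly 1 for every positive input
x = torch.tensor([0.0, 5.0, 10.0], requires_grad=True)
relu = nn.ReLU()
relu(x).sum().backward()
print(x.grad)  # tensor([0., 1., 1.])
```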

In this exercise, you'll begin by implementing ReLU in PyTorch. Next, you'll calculate the function's gradients.

The nn module has already been imported for you.

Instructions 1/2

  • Create a ReLU function in PyTorch.
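A minimal sketch of what a solution might look like, assuming the standard `torch.nn` API; the sample input value and the gradient check are illustrative additions, not the exercise's own data:

```python
import torch
import torch.nn as nn

# Create a ReLU function in PyTorch
relu_pytorch = nn.ReLU()

# Apply it to a sample input and compute the gradient
# (the input value -1.0 is an assumption for illustration)
x = torch.tensor(-1.0, requires_grad=True)
y = relu_pytorch(x)
y.backward()

print(y)       # tensor(0., grad_fn=<ReluBackward0>)
print(x.grad)  # tensor(0.) -- ReLU's gradient is 0 for negative inputs
```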