Implementing ReLU
The Rectified Linear Unit (ReLU) is a widely used activation function in deep learning that helps mitigate issues such as the vanishing gradients problem.
In this exercise, you'll implement ReLU in PyTorch, apply it to both positive and negative values, and observe the results.
The torch.nn package has already been imported for you as nn.
This exercise is part of the course Introduction to Deep Learning with PyTorch.
Have a go at this exercise by completing the sample code below.
# Create a ReLU function with PyTorch
relu_pytorch = ____
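
One possible completion is sketched below, assuming the standard nn.ReLU module; the example tensor values are illustrative and not part of the original exercise.

import torch
import torch.nn as nn

# Create a ReLU activation using PyTorch's nn module
relu_pytorch = nn.ReLU()

# Apply ReLU to a positive and a negative value (illustrative example values)
x_pos = torch.tensor(2.0)
x_neg = torch.tensor(-3.0)

print(relu_pytorch(x_pos))  # tensor(2.), positive inputs pass through unchanged
print(relu_pytorch(x_neg))  # tensor(0.), negative inputs are clamped to zero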