Implementing ReLU
The Rectified Linear Unit (ReLU) is a widely used activation function in deep learning that helps mitigate problems such as vanishing gradients. It computes max(0, x): positive inputs pass through unchanged, while negative inputs are mapped to zero.
In this exercise, you'll implement ReLU in PyTorch, apply it to both positive and negative values, and observe the results.
The torch.nn package has already been imported for you as nn.
This exercise is part of the course Introduction to Deep Learning with PyTorch.

Interactive hands-on exercise
Try to solve this exercise by completing the sample code below.
# Create a ReLU function with PyTorch
relu_pytorch = ____
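
For reference, a minimal solution sketch follows. It assumes torch itself is also imported; the sample values 2.0 and -3.0 are arbitrary choices for illustration.

import torch
import torch.nn as nn

# Create a ReLU function with PyTorch
relu_pytorch = nn.ReLU()

# Apply ReLU to a positive and a negative value (sample values chosen for illustration)
positive = torch.tensor(2.0)
negative = torch.tensor(-3.0)
print(relu_pytorch(positive))  # tensor(2.): positive inputs pass through unchanged
print(relu_pytorch(negative))  # tensor(0.): negative inputs are clamped to zero

As the printed outputs show, ReLU leaves positive values untouched and zeroes out negative ones, which is what lets gradients flow freely through active units.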