Implementing ReLU
The Rectified Linear Unit (ReLU) is a widely used activation function in deep learning that helps mitigate challenges like the vanishing gradient problem. It passes positive inputs through unchanged and maps negative inputs to zero: relu(x) = max(0, x).
In this exercise, you'll implement ReLU in PyTorch, apply it to both positive and negative values, and observe the results.
The torch.nn package has already been imported for you as nn.
This exercise is part of the course Introduction to Deep Learning with PyTorch.
Hands-on interactive exercise: try it by completing the sample code below.
# Create a ReLU function with PyTorch
relu_pytorch = nn.ReLU()
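To see the behavior described above, here is a minimal, self-contained sketch of the full exercise. It assumes only the standard torch and torch.nn imports; the tensor values (2.0 and -3.0) are illustrative choices, not part of the original exercise.

import torch
import torch.nn as nn

# Create a ReLU function with PyTorch
relu_pytorch = nn.ReLU()

# Apply ReLU to one positive and one negative value
x_pos = torch.tensor(2.0)
x_neg = torch.tensor(-3.0)

print(relu_pytorch(x_pos))  # tensor(2.) - positive inputs pass through unchanged
print(relu_pytorch(x_neg))  # tensor(0.) - negative inputs are clamped to zero

Because ReLU outputs exactly zero for all negative inputs, its gradient is 1 for positive inputs and 0 for negative ones, which is why it avoids the shrinking gradients that sigmoid-like activations produce in deep networks.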