
Implementing ReLU

The Rectified Linear Unit (ReLU) is a widely used activation function in deep learning that helps mitigate the vanishing gradient problem.
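For reference, ReLU is defined elementwise as

ReLU(x) = max(0, x)

so positive inputs pass through unchanged while negative inputs map to zero.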

In this exercise, you'll implement ReLU in PyTorch, apply it to both positive and negative values, and observe the results.

The torch.nn package has already been imported for you as nn.

This exercise is part of the course Introduction to Deep Learning with PyTorch.


Complete this exercise by finishing the sample code below.

# Create a ReLU function with PyTorch
relu_pytorch = ____
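A minimal sketch of what the completed exercise might look like, using the standard nn.ReLU module; the sample tensor values here are illustrative, not part of the original exercise:

import torch
import torch.nn as nn

# Create a ReLU function with PyTorch
relu_pytorch = nn.ReLU()

# Apply ReLU to a positive and a negative value
x_pos = torch.tensor(2.0)
x_neg = torch.tensor(-3.0)

print(relu_pytorch(x_pos))  # tensor(2.) -- positive inputs pass through unchanged
print(relu_pytorch(x_neg))  # tensor(0.) -- negative inputs are clamped to zero

Calling nn.ReLU() creates a module that can be applied like a function to any tensor, which is why the same relu_pytorch object works for both inputs.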