
The sigmoid and softmax functions

The sigmoid and softmax functions are key activation functions in deep learning, often used as the final step in a neural network.

  • Sigmoid is for binary classification
  • Softmax is for multi-class classification

Given a pre-activation output tensor from a network, apply the appropriate activation function to obtain the final output.
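For the multi-class case, the pre-activation tensor holds one raw score per class, and softmax turns those scores into a probability distribution: softmax(x_i) = exp(x_i) / sum_j exp(x_j), so the outputs are non-negative and sum to 1. A minimal sketch of this (the three class scores are made-up values, and the imports are included so the snippet runs on its own):

import torch
import torch.nn as nn

# Made-up pre-activation scores for one sample and three classes
input_tensor = torch.tensor([[4.3, 6.1, 2.3]])

# Softmax over the last dimension gives one probability per class
softmax = nn.Softmax(dim=-1)
probabilities = softmax(input_tensor)
print(probabilities)  # the three values sum to 1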

torch.nn has already been imported as nn.

This exercise is part of the course Introduction to Deep Learning with PyTorch.

Hands-on interactive exercise

Work through the sample code below to finish this exercise.

import torch
import torch.nn as nn

# Pre-activation output from the network: one raw score for one sample
input_tensor = torch.tensor([[2.4]])

# Create a sigmoid function and apply it on input_tensor
sigmoid = nn.Sigmoid()
probability = sigmoid(input_tensor)
print(probability)
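For the input 2.4, sigmoid returns 1 / (1 + exp(-2.4)), which is roughly 0.917, so the printed tensor indicates a high probability for the positive class.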