
Calculating cross-entropy loss

Cross-entropy loss is a widely used loss function for classification tasks. In this exercise, you’ll calculate cross-entropy loss in PyTorch using:

  • y: the ground truth label.
  • scores: a vector of predictions before softmax.
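
Concretely, for a single sample, cross-entropy applies softmax to the scores and then takes the negative log of the probability assigned to the true class. In standard notation, with C classes:

\[
\mathcal{L}(\text{scores}, y) = -\log \frac{e^{\text{scores}_y}}{\sum_{j=1}^{C} e^{\text{scores}_j}}
\]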

Loss functions help neural networks learn by measuring prediction errors. Create a one-hot encoded vector for y, define the cross-entropy loss function, and compute the loss using scores and the encoded label. The result will be a single float representing the sample's loss.

torch, CrossEntropyLoss, and torch.nn.functional as F have already been imported for you.
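
As a quick illustration of the encoding step, F.one_hot turns an integer class index into a one-hot vector; the values below mirror the exercise's label and its four classes:

import torch
import torch.nn.functional as F

# Class index 2 out of 4 classes becomes [0, 0, 1, 0]
label = F.one_hot(torch.tensor([2]), num_classes=4)
print(label)  # tensor([[0, 0, 1, 0]])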

This exercise is part of the course Introduction to Deep Learning with PyTorch.

Hands-on interactive exercise

Complete this example code to finish this exercise.

import torch
import torch.nn.functional as F
from torch.nn import CrossEntropyLoss

y = [2]
scores = torch.tensor([[0.1, 6.0, -2.0, 3.2]])

# Create a one-hot encoded vector of the label y
one_hot_label = F.____(torch.____(____), num_classes=____)
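
One possible completion is sketched below; it is not necessarily the course's official solution. It assumes a PyTorch version (1.10 or later) in which CrossEntropyLoss also accepts floating-point class probabilities, such as a one-hot vector, as the target:

import torch
import torch.nn.functional as F
from torch.nn import CrossEntropyLoss

y = [2]
scores = torch.tensor([[0.1, 6.0, -2.0, 3.2]])

# One-hot encode the label: class 2 of 4 -> tensor([[0, 0, 1, 0]])
one_hot_label = F.one_hot(torch.tensor(y), num_classes=4)

# Define the loss and compute it; both arguments are cast to double
# because CrossEntropyLoss expects floating-point probabilities here
criterion = CrossEntropyLoss()
loss = criterion(scores.double(), one_hot_label.double())
print(loss)

Since the true class (index 2) has the lowest score, softmax assigns it a tiny probability and the loss comes out large, roughly 8.06.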