Calculating cross-entropy loss
Cross-entropy loss is a widely used measure of error for classification tasks. In this exercise, you'll calculate cross-entropy loss in PyTorch using:

- y: the ground truth label.
- scores: a vector of predictions before softmax.
Loss functions help neural networks learn by measuring prediction errors. Create a one-hot encoded vector for y, define the cross-entropy loss function, and compute the loss using scores and the encoded label. The result will be a single float representing the sample's loss.
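For reference, given raw scores s (logits) and a one-hot label y over C classes, the cross-entropy loss for a single sample is

L = -\sum_{c=1}^{C} y_c \log\big(\mathrm{softmax}(s)_c\big),

which, because y is one-hot, reduces to the negative log of the softmax probability assigned to the true class.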
torch, CrossEntropyLoss, and torch.nn.functional as F have already been imported for you.
This exercise is part of the course Introduction to Deep Learning with PyTorch.

Hands-on interactive exercise
Try this exercise by completing the sample code below.
import torch
import torch.nn.functional as F
from torch.nn import CrossEntropyLoss
y = [2]
scores = torch.tensor([[0.1, 6.0, -2.0, 3.2]])
# Create a one-hot encoded vector of the label y
one_hot_label = F.one_hot(torch.tensor(y), num_classes=4)
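The exercise then asks you to define the loss function and compute the loss. A minimal sketch of those remaining steps, assuming PyTorch 1.10 or later (where CrossEntropyLoss accepts floating-point class-probability targets such as a one-hot vector):

# Define the cross-entropy loss function
criterion = CrossEntropyLoss()

# Compute the loss; F.one_hot returns int64, so cast the target to a float type
loss = criterion(scores.double(), one_hot_label.double())
print(loss)

This should print a loss of roughly 8.0619; it is large because the true class (index 2) received the lowest score.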