Calculating cross-entropy loss

Cross-entropy loss is one of the most widely used loss functions for classification. In this exercise, you'll calculate cross-entropy loss in PyTorch using:

  • y: the ground truth label.
  • scores: a vector of predictions before softmax.

Loss functions help neural networks learn by measuring prediction errors. Create a one-hot encoded vector for y, define the cross-entropy loss function, and compute the loss using scores and the encoded label. The result will be a single float representing the sample's loss.
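Under the hood, CrossEntropyLoss applies softmax to the raw scores and takes the negative log-probability of the true class. For a score vector $s$ and true class $y$:

$$L = -\log \frac{e^{s_y}}{\sum_j e^{s_j}} = -s_y + \log \sum_j e^{s_j}$$

For the scores used below, $[0.1, 6.0, -2.0, 3.2]$, with $y = 2$, this works out to $2.0 + \log(429.2) \approx 8.06$.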

torch, CrossEntropyLoss, and torch.nn.functional as F have already been imported for you.

This exercise is part of the course

Introduction to Deep Learning with PyTorch

Hands-on interactive exercise

Try this exercise by completing the sample code below.

import torch
import torch.nn.functional as F
from torch.nn import CrossEntropyLoss

y = [2]  # ground-truth class index for one sample
scores = torch.tensor([[0.1, 6.0, -2.0, 3.2]])  # raw model outputs (logits) for 4 classes

# Create a one-hot encoded vector of the label y
one_hot_label = F.____(torch.____(____), num_classes=____)
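For reference, here is a minimal sketch of one possible completion. It assumes 4 classes (so num_classes=4) and uses the one-hot vector as a probability-style target, which CrossEntropyLoss accepts in PyTorch 1.10 and newer:

import torch
import torch.nn.functional as F
from torch.nn import CrossEntropyLoss

y = [2]
scores = torch.tensor([[0.1, 6.0, -2.0, 3.2]])

# Create a one-hot encoded vector of the label y
one_hot_label = F.one_hot(torch.tensor(y), num_classes=4)

# Define the cross-entropy loss function
criterion = CrossEntropyLoss()

# Compute the loss; probability-style targets must be floats,
# so cast both tensors before calling the criterion
loss = criterion(scores.double(), one_hot_label.double())
print(loss)  # roughly tensor(8.0619, dtype=torch.float64)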