Introduction to Deep Learning with PyTorch


Exercise

Calculating cross-entropy loss

Cross-entropy loss is a widely used method to measure classification loss. In this exercise, you’ll calculate cross-entropy loss in PyTorch using:

  • y: the ground truth label.
  • scores: a vector of predictions before softmax.

Loss functions help neural networks learn by measuring prediction errors. Create a one-hot encoded vector for y, define the cross-entropy loss function, and compute the loss using scores and the encoded label. The result will be a single float representing the sample's loss.

torch, CrossEntropyLoss, and torch.nn.functional as F have already been imported for you.

Instructions 1 of 3

  • Create the one-hot encoded vector of the ground truth label y, with 4 features (one for each class), and assign it to one_hot_label.
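The steps above can be sketched as follows. The label and score values here are hypothetical placeholders, not the exercise's actual data; the structure mirrors the instructions (one-hot encode `y` with 4 classes, define `CrossEntropyLoss`, then call it on the raw scores and the encoded label):

```python
import torch
import torch.nn.functional as F
from torch.nn import CrossEntropyLoss

# Hypothetical sample data (the real exercise provides its own y and scores):
y = torch.tensor([1])                           # ground-truth class index
scores = torch.tensor([[0.1, 6.0, -2.0, 3.2]])  # raw scores (logits), one row per sample

# One-hot encode the label, with 4 features (one for each class)
one_hot_label = F.one_hot(y, num_classes=4)

# Define the cross-entropy loss function; it applies softmax internally,
# so the raw scores are passed in directly
criterion = CrossEntropyLoss()

# Compute the loss; float (double) targets are treated as class probabilities
loss = criterion(scores.double(), one_hot_label.double())
print(loss.item())  # a single float: this sample's loss
```

Because the target here is a one-hot row of probabilities, the result matches calling the loss with the integer class index directly, which is the more common usage outside this exercise.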