
Exercise

Training multi-output models

When training models with multiple outputs, it is crucial to ensure that the loss function is defined correctly.

In this case, the model produces two outputs: predictions for the alphabet and for the character. Each output has corresponding ground truth labels, which allows you to calculate two separate losses: one incurred from incorrect alphabet classifications, and the other from incorrect character classifications. Since both are multi-class classification tasks, the cross-entropy loss can be applied to each.

Gradient descent, however, can only optimize a single loss function. You will therefore define the total loss as the sum of the alphabet and character losses.
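
For intuition, here is a minimal, self-contained sketch showing how two cross-entropy losses combine into a single scalar that backpropagation can use. The batch size and class counts below are made up for illustration and are not taken from the exercise:

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()

# Fake logits from two output heads and matching ground-truth labels
# (8 samples, 30 alphabet classes, 964 character classes are placeholders)
outputs_alpha = torch.randn(8, 30, requires_grad=True)
outputs_char = torch.randn(8, 964, requires_grad=True)
labels_alpha = torch.randint(0, 30, (8,))
labels_char = torch.randint(0, 964, (8,))

loss_alpha = criterion(outputs_alpha, labels_alpha)  # loss for the alphabet head
loss_char = criterion(outputs_char, labels_char)     # loss for the character head
loss = loss_alpha + loss_char                        # single scalar to optimize
loss.backward()                                      # gradients flow to both heads
```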

Instructions

  • Calculate the alphabet classification loss and assign it to loss_alpha.
  • Calculate the character classification loss and assign it to loss_char.
  • Compute the total loss as the sum of the two partial losses and assign it to loss.
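
Putting it together, the completed step might look like the sketch below. The two-head model, its class counts, and the dummy batch are assumptions made so the example runs on its own; in the exercise, the model and data come from the provided scaffold rather than being defined here.

```python
import torch
import torch.nn as nn
import torch.optim as optim

# Hypothetical two-output classifier: a shared body feeding one head per task.
class TwoOutputNet(nn.Module):
    def __init__(self, num_alphabets=30, num_chars=964):
        super().__init__()
        self.body = nn.Sequential(nn.Flatten(), nn.Linear(64 * 64, 128), nn.ReLU())
        self.head_alpha = nn.Linear(128, num_alphabets)
        self.head_char = nn.Linear(128, num_chars)

    def forward(self, x):
        features = self.body(x)
        return self.head_alpha(features), self.head_char(features)

net = TwoOutputNet()
criterion = nn.CrossEntropyLoss()
optimizer = optim.Adam(net.parameters(), lr=0.001)

# Dummy batch standing in for one iteration over the training DataLoader
images = torch.randn(8, 1, 64, 64)
labels_alpha = torch.randint(0, 30, (8,))
labels_char = torch.randint(0, 964, (8,))

optimizer.zero_grad()
outputs_alpha, outputs_char = net(images)

loss_alpha = criterion(outputs_alpha, labels_alpha)  # alphabet classification loss
loss_char = criterion(outputs_char, labels_char)     # character classification loss
loss = loss_alpha + loss_char                        # total loss: sum of the partial losses

loss.backward()
optimizer.step()
```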