
Training multi-output models

When training models with multiple outputs, it is crucial to ensure that the loss function is defined correctly.

In this case, the model produces two outputs: predictions for the alphabet and for the character. Each output has corresponding ground truth labels, which lets you calculate two separate losses: one incurred from incorrect alphabet classifications and the other from incorrect character classifications. Since both are multi-class classification tasks, the cross-entropy loss can be applied to each.
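A model like this is typically built with a shared backbone and two output heads. The sketch below is a minimal, hypothetical example; the layer sizes and class counts are invented for illustration and are not the architecture used in the course.

```python
import torch
import torch.nn as nn

class TwoHeadNet(nn.Module):
    """Hypothetical two-output network: shared trunk, two classifier heads."""

    def __init__(self, in_features=64, num_alphabets=30, num_chars=100):
        super().__init__()
        self.backbone = nn.Linear(in_features, 128)      # shared features
        self.head_alpha = nn.Linear(128, num_alphabets)  # alphabet logits
        self.head_char = nn.Linear(128, num_chars)       # character logits

    def forward(self, x):
        feats = torch.relu(self.backbone(x))
        # Return both sets of logits as a tuple
        return self.head_alpha(feats), self.head_char(feats)

net = TwoHeadNet()
x = torch.randn(8, 64)                  # batch of 8 dummy inputs
out_alpha, out_char = net(x)            # one forward pass, two outputs
```

Each head produces logits of a different size, so each is paired with its own labels when computing the loss.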

Gradient descent can optimize only one loss function, however. You will thus define the total loss as the sum of the alphabet and character losses.
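Because the total loss is a plain sum, calling `backward()` on it propagates gradients through both heads at once. A minimal sketch with random logits and labels (shapes chosen arbitrarily for illustration):

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()

# Dummy logits for two tasks with 5 and 10 classes, plus integer labels
logits_alpha = torch.randn(4, 5, requires_grad=True)
logits_char = torch.randn(4, 10, requires_grad=True)
labels_alpha = torch.randint(0, 5, (4,))
labels_char = torch.randint(0, 10, (4,))

loss_alpha = criterion(logits_alpha, labels_alpha)
loss_char = criterion(logits_char, labels_char)
loss = loss_alpha + loss_char   # single scalar for gradient descent
loss.backward()                 # gradients flow into both sets of logits
```

After `backward()`, both `logits_alpha.grad` and `logits_char.grad` are populated, confirming that one optimizer step updates parameters feeding either head.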

This exercise is part of the course

Intermediate Deep Learning with PyTorch


Exercise instructions

  • Calculate the alphabet classification loss and assign it to loss_alpha.
  • Calculate the character classification loss and assign it to loss_char.
  • Compute the total loss as the sum of the two partial losses and assign it to loss.

Hands-on interactive exercise

Try this exercise by completing the sample code below.

net = Net()
criterion = nn.CrossEntropyLoss()
optimizer = optim.SGD(net.parameters(), lr=0.05)

for epoch in range(1):
    for images, labels_alpha, labels_char in dataloader_train:
        optimizer.zero_grad()
        outputs_alpha, outputs_char = net(images)
        # Compute alphabet classification loss
        loss_alpha = criterion(outputs_alpha, labels_alpha)
        # Compute character classification loss
        loss_char = criterion(outputs_char, labels_char)
        # Compute total loss as the sum of the two partial losses
        loss = loss_alpha + loss_char
        loss.backward()
        optimizer.step()
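For reference, here is a fully self-contained version of the training loop. The `Net` architecture, the input size (16 features), and the class counts (3 alphabets, 7 characters) are invented stand-ins for the model and `dataloader_train` that the exercise environment provides.

```python
import torch
import torch.nn as nn
import torch.optim as optim
from torch.utils.data import DataLoader, TensorDataset

class Net(nn.Module):
    """Stand-in two-output model; sizes are illustrative only."""

    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(16, 32)
        self.out_alpha = nn.Linear(32, 3)  # 3 alphabet classes (assumed)
        self.out_char = nn.Linear(32, 7)   # 7 character classes (assumed)

    def forward(self, x):
        h = torch.relu(self.fc(x))
        return self.out_alpha(h), self.out_char(h)

# Synthetic dataset standing in for dataloader_train
data = torch.randn(20, 16)
targets_alpha = torch.randint(0, 3, (20,))
targets_char = torch.randint(0, 7, (20,))
dataloader_train = DataLoader(
    TensorDataset(data, targets_alpha, targets_char), batch_size=4
)

net = Net()
criterion = nn.CrossEntropyLoss()
optimizer = optim.SGD(net.parameters(), lr=0.05)

for epoch in range(1):
    for images, labels_alpha, labels_char in dataloader_train:
        optimizer.zero_grad()
        outputs_alpha, outputs_char = net(images)
        loss_alpha = criterion(outputs_alpha, labels_alpha)
        loss_char = criterion(outputs_char, labels_char)
        loss = loss_alpha + loss_char
        loss.backward()
        optimizer.step()
```

The loop mirrors the exercise snippet exactly; only the model and data are replaced with synthetic stand-ins so the code runs on its own.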