
AdamW with Trainer

You're beginning to train a Transformer model to simplify language translations. As a first step, you decide to use the AdamW optimizer as a benchmark and the Trainer interface for quick setup. Set up Trainer to use the AdamW optimizer.

AdamW has been pre-imported from torch.optim. Some training objects have been pre-loaded: model, training_args, train_dataset, validation_dataset, compute_metrics.

This exercise is part of the course Efficient AI Model Training with PyTorch.

Exercise instructions

  • Pass the model parameters to the AdamW optimizer.
  • Pass the optimizer to Trainer.

Hands-on interactive exercise

Complete this exercise by filling in the sample code below.

# Pass the model parameters to the AdamW optimizer
optimizer = ____(params=____.____())

# Pass the optimizer to Trainer
trainer = Trainer(model=model,
                  args=training_args,
                  train_dataset=train_dataset,
                  eval_dataset=validation_dataset,
                  ____=(____, None),
                  compute_metrics=compute_metrics)

trainer.train()
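For reference, here is one possible completion of the blanks, written as a self-contained sketch. It assumes the same pre-loaded objects described above (model, training_args, train_dataset, validation_dataset, compute_metrics) and adds the imports that the exercise environment otherwise provides. The optimizers argument of Trainer expects an (optimizer, lr_scheduler) tuple, so passing None as the second element lets Trainer fall back to its default learning-rate scheduler.

from torch.optim import AdamW
from transformers import Trainer

# Pass the model parameters to the AdamW optimizer
optimizer = AdamW(params=model.parameters())

# Pass the optimizer to Trainer as the first element of the optimizers tuple;
# None leaves the learning-rate scheduler at Trainer's default
trainer = Trainer(model=model,
                  args=training_args,
                  train_dataset=train_dataset,
                  eval_dataset=validation_dataset,
                  optimizers=(optimizer, None),
                  compute_metrics=compute_metrics)

trainer.train()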