
Exercise

AdamW with Trainer

You're beginning to train a Transformer model to simplify language translation. As a first step, you decide to use the AdamW optimizer as a benchmark and the Trainer interface for a quick setup. Set up Trainer to use the AdamW optimizer.

AdamW has been pre-imported from torch.optim. Some training objects have been pre-loaded: model, training_args, train_dataset, validation_dataset, compute_metrics.

Instructions

  • Pass the model parameters to the AdamW optimizer.
  • Pass the optimizer to Trainer (a sketch of the full setup follows these instructions).
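
A minimal sketch of one possible solution, assuming the pre-loaded objects listed above (model, training_args, train_dataset, validation_dataset, compute_metrics) and the Hugging Face Trainer; the learning rate value is a placeholder assumption, not prescribed by the exercise:

    from torch.optim import AdamW
    from transformers import Trainer

    # Pass the model parameters to the AdamW optimizer
    # (lr=3e-5 is a placeholder value, not specified by the exercise)
    optimizer = AdamW(params=model.parameters(), lr=3e-5)

    # Pass the optimizer to Trainer via the optimizers argument;
    # the second tuple element is an optional learning rate scheduler
    # (None falls back to the default scheduler)
    trainer = Trainer(
        model=model,
        args=training_args,
        train_dataset=train_dataset,
        eval_dataset=validation_dataset,
        compute_metrics=compute_metrics,
        optimizers=(optimizer, None),
    )

    trainer.train()

If optimizers is left at its default, Trainer builds its own AdamW optimizer from training_args; passing your own instance makes the choice explicit and lets you control settings such as the learning rate and weight decay directly.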