AdamW with Trainer
You're beginning to train a Transformer model for language translation. As a first step, you decide to use the AdamW optimizer as a baseline and the Trainer interface for quick setup. Set up Trainer to use the AdamW optimizer.
AdamW has been pre-imported from torch.optim. Some training objects have been pre-loaded: model, training_args, train_dataset, validation_dataset, compute_metrics.
This exercise is part of the course
Efficient AI Model Training with PyTorch
Exercise instructions
- Pass the model parameters to the AdamW optimizer.
- Pass the optimizer to Trainer.
Hands-on interactive exercise
Try this exercise by completing the sample code below.
# Pass the model parameters to the AdamW optimizer
optimizer = ____(params=____.____())
# Pass the optimizer to Trainer
trainer = Trainer(model=model,
                  args=training_args,
                  train_dataset=train_dataset,
                  eval_dataset=validation_dataset,
                  ____=(____, None),
                  compute_metrics=compute_metrics)
trainer.train()
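The wiring being practiced here can be checked on a toy model without the Trainer machinery: construct AdamW from the model's parameters and take one optimization step. This is a minimal sketch; the linear layer, learning rate, and random data are illustrative stand-ins for the pre-loaded model and dataset.

```python
import torch
import torch.nn as nn
from torch.optim import AdamW

torch.manual_seed(0)

# Stand-in for the pre-loaded Transformer model
model = nn.Linear(4, 2)

# Pass the model parameters to the AdamW optimizer
optimizer = AdamW(params=model.parameters(), lr=1e-2)

# One training step on random data: forward, backward, update
x = torch.randn(8, 4)
y = torch.randn(8, 2)
weights_before = model.weight.detach().clone()

loss = nn.functional.mse_loss(model(x), y)
loss.backward()
optimizer.step()
optimizer.zero_grad()

# The optimizer step should have updated the weights
assert not torch.equal(weights_before, model.weight.detach())
```

In the Trainer setup itself, the custom optimizer is passed through the `optimizers` keyword, which takes a `(optimizer, lr_scheduler)` tuple; passing `None` for the scheduler lets Trainer create its default one.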