
Adafactor with Trainer

You're training a Transformer model with billions of parameters for your language translation service. Training it is straining your computational resources, so you decide to try the Adafactor optimizer, which reduces memory requirements compared to AdamW. Prepare the Trainer for Adafactor!

Some training objects have been pre-loaded, including model, train_dataset, validation_dataset, and compute_metrics.

This exercise is part of the course

Efficient AI Model Training with PyTorch


Exercise instructions

  • Specify Adafactor as an optimizer in TrainingArguments.
  • Pass in the optimizer state to print its size.

Hands-on interactive exercise

Try this exercise by completing the sample code below.

# Specify Adafactor as an optimizer
training_args = TrainingArguments(output_dir="./results",
                                  evaluation_strategy="epoch",
                                  ____="____")

trainer = Trainer(model=model,
                  args=training_args,
                  train_dataset=train_dataset,
                  eval_dataset=validation_dataset,
                  compute_metrics=compute_metrics)
trainer.train()

# Pass in the optimizer state
total_size_megabytes, total_num_elements = compute_optimizer_size(____.____.____.values())
print(f"\nNumber of optimizer parameters: {total_num_elements:,}\nOptimizer size: {total_size_megabytes:.0f} MB")  