Efficient AI Model Training with PyTorch


Exercise

8-bit Adam with Accelerator

You would like to customize your training loop with 8-bit Adam to reduce the memory required to train your model. Prepare the loop to use 8-bit Adam for training.

Assume that an 8-bit Adam optimizer has been defined as adam_bnb_optim. Other training objects have been defined: model, train_dataloader, lr_scheduler, and accelerator.
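For reference, an optimizer like adam_bnb_optim is typically built with the bitsandbytes library. The snippet below is a minimal sketch, assuming bitsandbytes is installed and using an illustrative learning rate that is not specified by the exercise:

```python
import bitsandbytes as bnb

# 8-bit Adam keeps optimizer state in 8 bits instead of 32,
# which substantially reduces optimizer memory during training.
adam_bnb_optim = bnb.optim.Adam8bit(
    model.parameters(),  # model is assumed to be defined, as stated above
    lr=1e-3,             # illustrative value only
)
```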

Instructions

  • Prepare the 8-bit Adam optimizer for distributed training.
  • Update the model parameters with the optimizer.
  • Zero the gradients with the optimizer.
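A minimal sketch of how the loop could look, assuming the objects named above (adam_bnb_optim, model, train_dataloader, lr_scheduler, accelerator) are already defined and that the model returns a Hugging Face-style output with a .loss attribute; the exact batch and output fields may differ in your setup:

```python
# Prepare the 8-bit Adam optimizer (and the other training objects)
# for distributed training with Accelerator.
model, adam_bnb_optim, train_dataloader, lr_scheduler = accelerator.prepare(
    model, adam_bnb_optim, train_dataloader, lr_scheduler
)

model.train()
for batch in train_dataloader:
    outputs = model(**batch)       # forward pass; assumes dict-style batches
    loss = outputs.loss            # assumes a Hugging Face-style model output
    accelerator.backward(loss)     # backward pass routed through Accelerator
    adam_bnb_optim.step()          # update the model parameters with the optimizer
    lr_scheduler.step()            # advance the learning-rate schedule
    adam_bnb_optim.zero_grad()     # zero the gradients with the optimizer
```

Passing the optimizer through accelerator.prepare() is what allows the same loop to run unchanged on a single GPU or across multiple devices.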