Configuring the optimizer
Now that we have the training logic, we need to specify how to optimize the model's parameters. In this exercise, you'll complete the configure_optimizers method of a PyTorch Lightning module used for image classification. Your goal is to set up an optimizer that updates the model's parameters during training; to do so, you'll use the Adam optimizer with a learning rate of 1e-3.
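For context, here is a minimal sketch of where configure_optimizers sits inside a LightningModule. The LitClassifier name and the tiny placeholder network are illustrative assumptions, not the course's exact model:

import torch
import torch.nn as nn
import torch.nn.functional as F
import pytorch_lightning as pl

class LitClassifier(pl.LightningModule):  # illustrative name, not the course's module
    def __init__(self):
        super().__init__()
        # Placeholder network for 28x28 grayscale images (an assumption for this sketch)
        self.model = nn.Sequential(
            nn.Flatten(),
            nn.Linear(28 * 28, 128),
            nn.ReLU(),
            nn.Linear(128, 10),
        )

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = F.cross_entropy(self.model(x), y)
        return loss

    def configure_optimizers(self):
        # Lightning calls this once to get the optimizer used during training
        return torch.optim.Adam(self.parameters(), lr=1e-3)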
Exercise instructions
- Create an Adam optimizer using the model's parameters, setting the learning rate to 1e-3.
Hands-on interactive exercise
Have a go at this exercise by completing this sample code.
import torch

def configure_optimizers(self):
    # Create an Adam optimizer for the model's parameters
    optimizer = ____
    return optimizer
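For reference, one way to fill in the blank (a sketch that assumes this method lives on a LightningModule, so self.parameters() returns the model's weights):

import torch

def configure_optimizers(self):
    # Adam over all of the module's parameters, with a 1e-3 learning rate
    optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
    return optimizer

Lightning also accepts richer return values from configure_optimizers, such as a list of optimizers or a dict pairing an optimizer with an LR scheduler, but a single optimizer is all this exercise calls for.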