
Exercise

Optimizers

It's time to explore the different optimizers that you can use for training your model.

A custom function called train_model(optimizer, net, num_epochs) has been defined for you. It takes the optimizer, the model, and the number of epochs as inputs, runs the training loop, and prints the training loss at the end.

Let's use train_model() to run a few short trainings with different optimizers and compare the results!
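
The exercise provides train_model() for you, so you don't need to write it yourself, but as a rough mental model, a minimal sketch of such a training function might look like the one below. The MSE loss and the dataloader name are illustrative assumptions, not the exercise's actual implementation.

```python
import torch.nn as nn

def train_model(optimizer, net, num_epochs):
    # Sketch only: assumes an MSE loss and a DataLoader named `dataloader`,
    # neither of which is specified by the exercise.
    criterion = nn.MSELoss()
    for epoch in range(num_epochs):
        for features, labels in dataloader:
            optimizer.zero_grad()              # reset gradients from the previous step
            outputs = net(features)            # forward pass
            loss = criterion(outputs, labels)  # compute training loss
            loss.backward()                    # backpropagate
            optimizer.step()                   # update model parameters
    print(f"Training loss: {loss.item():.4f}")
```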

Instructions

1. Define the optimizer as Stochastic Gradient Descent (SGD).
2. Define the optimizer as Root Mean Square Propagation (RMSprop), passing the model's parameters as its first argument.
3. Define the optimizer as Adaptive Moments Estimation (Adam), setting the learning rate to 0.001.
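
Assuming net and train_model() are already defined in the exercise environment, the three optimizers could be set up along the following lines. The learning rate for SGD and the epoch count are illustrative choices, not values given by the exercise.

```python
import torch.optim as optim

# 1. Stochastic Gradient Descent (SGD has no default lr, so one must be supplied)
optimizer = optim.SGD(net.parameters(), lr=0.01)
train_model(optimizer, net, num_epochs=10)

# 2. RMSprop, with the model's parameters as the first argument
optimizer = optim.RMSprop(net.parameters())
train_model(optimizer, net, num_epochs=10)

# 3. Adam, with the learning rate set to 0.001
optimizer = optim.Adam(net.parameters(), lr=0.001)
train_model(optimizer, net, num_epochs=10)
```

Comparing the printed losses after each run gives a quick sense of how the choice of optimizer affects training on this task.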