Exercise

Using the PyTorch optimizer

Earlier, you manually updated the weight of a network, gaining insight into how training works behind the scenes. However, this method isn’t scalable for deep networks with many layers.

Thankfully, PyTorch provides the SGD optimizer, which automates this process efficiently in just a few lines of code. Now, you’ll complete the training loop by updating the weights using a PyTorch optimizer.

A neural network has been created and provided as the model variable. This model was used to run a forward pass and produce the tensor of predictions pred. The one-hot encoded target tensor is named target, and the cross-entropy loss function is stored as criterion.

torch.optim (as optim) and torch.nn (as nn) have already been imported for you.
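
For context, the pre-loaded objects might look roughly like the sketch below. The exact architecture, tensor shapes, and values are not specified in the exercise, so everything here is an illustrative assumption.

    import torch
    import torch.nn as nn
    import torch.optim as optim

    # Hypothetical stand-ins for the pre-loaded objects; the real model
    # architecture and tensor shapes are not given in the exercise.
    model = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 4))
    features = torch.randn(1, 16)               # one sample with 16 features
    pred = model(features)                       # forward pass -> predictions
    target = torch.tensor([[0., 1., 0., 0.]])    # one-hot encoded target
    criterion = nn.CrossEntropyLoss()            # cross-entropy loss function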

Instructions 1/2

  • Use optim to create an SGD optimizer with a learning rate of your choice (must be less than one) for the model provided.
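
A minimal sketch of this step, assuming model, pred, target, and criterion exist as described above. The learning rate of 0.001 is an arbitrary choice below one, and the lines after the optimizer creation anticipate the rest of the training step, which is not shown in this part of the exercise.

    # Create the SGD optimizer for the model's parameters;
    # lr is a free choice, as long as it is less than one.
    optimizer = optim.SGD(model.parameters(), lr=0.001)

    # Typical continuation of the training loop (assumed, not part of
    # instruction 1/2): compute the loss, backpropagate, and let the
    # optimizer update the weights.
    loss = criterion(pred, target)
    loss.backward()
    optimizer.step()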