
Using the PyTorch optimizer

Earlier, you manually updated the weights of a network, gaining insight into how training works behind the scenes. However, this approach doesn't scale to deep networks with many layers.

Thankfully, PyTorch provides the SGD optimizer, which automates this process efficiently in just a few lines of code. Now, you’ll complete the training loop by updating the weights using a PyTorch optimizer.

A neural network has been created and provided as the model variable. It was used to run a forward pass and produce the tensor of predictions, pred. The one-hot encoded target tensor is named target, and the cross-entropy loss function is stored as criterion.

torch.optim has already been imported as optim, and torch.nn as nn.
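For reference, here is a minimal sketch of the pattern this exercise builds toward, using the model, pred, target, and criterion objects described above. The learning rate value is an illustrative assumption, not one prescribed by the exercise.

# A minimal sketch, assuming the exercise's model, pred, target, and criterion
# are already defined; lr=0.001 is an assumed example value.
optimizer = optim.SGD(model.parameters(), lr=0.001)

optimizer.zero_grad()            # clear any previously accumulated gradients
loss = criterion(pred, target)   # compute the cross-entropy loss
loss.backward()                  # compute gradients for every model parameter
optimizer.step()                 # update the weights using the gradients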

This exercise is part of the course Introduction to Deep Learning with PyTorch.

Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

# Create the optimizer
optimizer = ____