Freeze layers of a model
You are about to fine-tune a model on a new task after loading pre-trained weights. The model contains three linear layers. However, because your dataset is small, you only want to train the last linear layer of this model and freeze the first two linear layers.
The model has already been created and exists under the variable model. You will be using the named_parameters() method of the model to list its parameters. Each parameter is described by a name: a string that follows the naming convention x.name, where x is the index of the layer. Remember that a linear layer has two parameters: the weight and the bias.
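To see what these names look like, here is a minimal sketch that builds a hypothetical three-layer model (the layer sizes are an assumption for illustration, not the course's actual model) and prints each parameter name alongside its shape:

import torch.nn as nn

# Hypothetical three-layer model used only to illustrate the naming convention
model = nn.Sequential(nn.Linear(8, 16), nn.Linear(16, 16), nn.Linear(16, 2))

for name, param in model.named_parameters():
    print(name, param.shape)
# 0.weight torch.Size([16, 8])
# 0.bias   torch.Size([16])
# 1.weight torch.Size([16, 16])
# 1.bias   torch.Size([16])
# 2.weight torch.Size([2, 16])
# 2.bias   torch.Size([2])

Each name combines the layer index (0, 1, 2) with the parameter name (weight or bias).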
This exercise is part of the course Introduction to Deep Learning with PyTorch.
Exercise instructions
- Use an if statement to determine whether the parameter should be frozen or not, based on its name.
- Freeze the parameters of the first two layers of this model.
Hands-on interactive exercise
Have a go at this exercise by completing this sample code.
for name, param in model.named_parameters():
    # Check if the parameters belong to the first layer
    if name == '____':
        # Freeze the parameters
        ____.____ = ____
    # Check if the parameters belong to the second layer
    if name == '____':
        # Freeze the parameters
        ____.____ = ____
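One possible completion is sketched below. It assumes the three linear layers are registered at indices 0, 1, and 2 in the parameter names (as they would be in an nn.Sequential containing only the three nn.Linear modules); if the model interleaves other modules, the indices will differ. The model definition here is hypothetical, included only to make the snippet self-contained.

import torch.nn as nn

# Hypothetical stand-in for the pre-trained model: three linear layers
# registered at indices 0, 1, and 2 (an assumption made for illustration)
model = nn.Sequential(nn.Linear(8, 16), nn.Linear(16, 16), nn.Linear(16, 2))

for name, param in model.named_parameters():
    # Check if the parameters belong to the first layer (index 0)
    if name == '0.weight' or name == '0.bias':
        # Freeze the parameters so they are not updated during training
        param.requires_grad = False
    # Check if the parameters belong to the second layer (index 1)
    if name == '1.weight' or name == '1.bias':
        # Freeze the parameters so they are not updated during training
        param.requires_grad = False

# Only the last layer's parameters should still require gradients
for name, param in model.named_parameters():
    print(name, param.requires_grad)

Setting requires_grad to False excludes those parameters from gradient computation, so the optimizer will only update the last linear layer during fine-tuning.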