Freeze layers of a model
You are about to fine-tune a model on a new task after loading pre-trained weights. The model contains three linear layers. However, because your dataset is small, you only want to train the last linear layer of this model and freeze the first two linear layers.
The model has already been created and is available as the variable model. You will use its named_parameters method to list the model's parameters. Each parameter is identified by a name: a string of the form x.name, where x is the index of the layer and name is the parameter within that layer.
Remember that a linear layer has two parameters: the weight and the bias.
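For illustration, here is a minimal sketch assuming model is an nn.Sequential of three linear layers (the layer sizes are made up for this example); named_parameters then yields names such as '0.weight' and '0.bias':

    import torch.nn as nn

    # Hypothetical stand-in for the pre-trained model: three linear layers
    # in an nn.Sequential container (sizes chosen only for illustration).
    model = nn.Sequential(nn.Linear(16, 8), nn.Linear(8, 4), nn.Linear(4, 2))

    for name, param in model.named_parameters():
        # Prints '0.weight', '0.bias', '1.weight', '1.bias', '2.weight', '2.bias'
        print(name, tuple(param.shape))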
This exercise is part of the course
Introduction to Deep Learning with PyTorch
Exercise instructions
- Use an if statement to check if a parameter is a weight from the first or second layer.
- Freeze the weights of the first two layers of this model.
Hands-on interactive exercise
Try this exercise by completing the sample code below.
for name, param in model.named_parameters():
    # Check for first layer's weight
    if name == '____':
        # Freeze this weight
        param.____ = ____
    # Check for second layer's weight
    if name == '____':
        # Freeze this weight
        param.____ = ____
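One possible completion, as a sketch under the same assumption that the three linear layers are indexed 0, 1, and 2: the first two layers' weights are then named '0.weight' and '1.weight', and freezing a parameter means setting its requires_grad attribute to False so the optimizer no longer updates it.

    import torch.nn as nn

    # Hypothetical model with three linear layers (sizes chosen for illustration).
    model = nn.Sequential(nn.Linear(16, 8), nn.Linear(8, 4), nn.Linear(4, 2))

    for name, param in model.named_parameters():
        # Check for first layer's weight
        if name == '0.weight':
            # Freeze this weight
            param.requires_grad = False
        # Check for second layer's weight
        if name == '1.weight':
            # Freeze this weight
            param.requires_grad = False

    # Only the last layer's weight (plus the biases) remains trainable.
    print([name for name, param in model.named_parameters() if param.requires_grad])

Note that the template above only checks the weights; if you want the first two layers fully frozen, you could also set requires_grad = False on '0.bias' and '1.bias'.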