
Freeze layers of a model

You are about to fine-tune a model on a new task after loading pre-trained weights. The model contains three linear layers. However, because your dataset is small, you only want to train the last linear layer of this model and freeze the first two linear layers.

The model has already been created and is available as the variable model. You will use the model's named_parameters method to list its parameters. Each parameter is identified by a name: a string following the convention x.name, where x is the index of the layer (so the first layer's weight is named 0.weight).

Remember that a linear layer has two parameters: the weight and the bias.
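
To make the naming concrete, here is a minimal sketch of how these names appear, assuming a model built with nn.Sequential (the exercise's model is already created for you, so this snippet is illustrative only):

import torch.nn as nn

# Hypothetical three-layer model, only to illustrate parameter names;
# the exercise already provides `model` for you.
model = nn.Sequential(
    nn.Linear(16, 8),
    nn.Linear(8, 4),
    nn.Linear(4, 2),
)

# With nn.Sequential, names follow the x.name convention:
# '0.weight', '0.bias', '1.weight', '1.bias', '2.weight', '2.bias'
for name, param in model.named_parameters():
    print(name, param.shape)

Checking a parameter's name therefore tells you which layer it belongs to, and whether it is that layer's weight or bias.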

This exercise is part of the course

Introduction to Deep Learning with PyTorch

Exercise instructions

  • Use an if statement to check if a parameter is a weight from the first or second layer.
  • Freeze the weights of the first two layers of this model.

Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

for name, param in model.named_parameters():

    # Check for first layer's weight
    if name == '____':

        # Freeze this weight
        param.____ = ____

    # Check for second layer's weight
    if name == '____':

        # Freeze this weight
        param.____ = ____
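
Once you have completed the loop, a quick way to confirm the result is to list each parameter's trainable status; this sketch assumes you run it right after your loop:

# Verify which parameters are still trainable after freezing
for name, param in model.named_parameters():
    print(name, param.requires_grad)

A common follow-up when fine-tuning is to pass only the trainable parameters to the optimizer, for example with filter(lambda p: p.requires_grad, model.parameters()).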