
Loading and inspecting pre-trained models

You're building a conversational AI assistant that can engage in human-like dialog across a wide range of topics, leveraging the powerful BERT model that has been pre-trained on a large corpus of text data.

You'll print the configuration to verify that you've loaded a conversational AI model with certain parameters like model_type: bert, num_attention_heads: 12, and num_hidden_layers: 12.

This exercise is part of the course

Efficient AI Model Training with PyTorch


Exercise instructions

  • Initialize the model parameters with the appropriate AutoModel class to load the bert-base-uncased model.
  • Print the model's configuration.

Hands-on interactive exercise

Try this exercise by completing the sample code below.

from transformers import AutoModelForSequenceClassification

# Load a pre-trained bert-base-uncased model
model = ____.____("bert-base-uncased")

# Print the model's configuration
print(model.____)
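One way the blanks might be filled in, assuming the AutoModelForSequenceClassification class already imported in the template is the intended one (the instructions only say "the appropriate AutoModel class"):

```python
from transformers import AutoModelForSequenceClassification

# Load a pre-trained bert-base-uncased model
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

# Print the model's configuration; for bert-base-uncased it reports
# model_type: bert, num_attention_heads: 12, num_hidden_layers: 12
print(model.config)
```

Printing `model.config` shows the architecture hyperparameters baked into the checkpoint, which is a quick way to verify you loaded the model you intended.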