
Text completion with pre-trained GPT-2 models

Back at PyBooks, your current project involves creating captivating narratives based on existing stories to engage customers and enhance their experience. To achieve this, you need a powerful text generation tool that can seamlessly generate compelling text continuations. You'll be using a pre-trained model to get the job done.

The following has been loaded for you: torch, plus GPT2Tokenizer and GPT2LMHeadModel from transformers.

This exercise is part of the course

Deep Learning for Text with PyTorch


Hands-on interactive exercise

Try this exercise by completing the sample code below.

# Initialize the tokenizer
tokenizer = ____

# Initialize the pre-trained model
model = ____