
Text completion with pre-trained GPT-2 models

Back at PyBooks, your current project involves creating captivating narratives based on existing stories to engage customers and enhance their experience. To achieve this, you need a powerful text generation tool that can seamlessly generate compelling text continuations. You'll be using a pre-trained model to get the job done.

The following has been loaded for you: torch, plus GPT2Tokenizer and GPT2LMHeadModel from transformers.

This exercise is part of the course

Deep Learning for Text with PyTorch


Hands-on interactive exercise

Try this exercise by completing the sample code below.

# Initialize the tokenizer
tokenizer = ____

# Initialize the pre-trained model
model = ____