
Text completion with pre-trained GPT-2 models

Back at PyBooks, your current project involves creating captivating narratives based on existing stories to engage customers and enhance their experience. To achieve this, you need a powerful text generation tool that can seamlessly generate compelling text continuations. You'll be using a pre-trained model to get the job done.

The following has been loaded for you: torch, as well as GPT2Tokenizer and GPT2LMHeadModel from transformers.

This exercise is part of the course

Deep Learning for Text with PyTorch


Interactive exercise

Complete the sample code to successfully finish this exercise.

# Initialize the tokenizer
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")

# Initialize the pre-trained model
model = GPT2LMHeadModel.from_pretrained("gpt2")
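Once the tokenizer and model are initialized, they can be used to continue a story prompt. Below is a minimal, self-contained sketch of that workflow; the prompt text, the max_length value, and the greedy decoding defaults are illustrative assumptions, not part of the exercise.

from transformers import GPT2Tokenizer, GPT2LMHeadModel

# Load the pre-trained GPT-2 tokenizer and language model (assumed "gpt2" checkpoint)
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Illustrative prompt -- any story opening would work
prompt = "Once upon a time in a quiet village"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Generate a continuation (greedy decoding by default; max_length chosen arbitrarily)
output_ids = model.generate(input_ids, max_length=50, pad_token_id=tokenizer.eos_token_id)

# Decode the generated token IDs back into readable text
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

For more varied continuations, sampling-based settings (for example, do_sample=True with a temperature) can be passed to generate instead of relying on greedy decoding.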