
Text generation examples

In this exercise, you are going to experiment with two pre-trained models for text generation.

The first model generates a phrase based on the character Sheldon from the TV show The Big Bang Theory, and the second generates a Shakespeare-style poem of up to 400 characters.

The models are loaded into the sheldon_model and poem_model variables. Two custom helper functions, generate_sheldon_phrase() and generate_poem(), are also available; both take the pre-trained model and a context string as parameters.
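The exercise environment provides these helpers ready-made, so their internals are not shown here. As a rough, hypothetical sketch of what such a helper might do, the function below performs greedy character-by-character sampling with a Keras character-level model. The names char_to_index and index_to_char, the sequence length, the one-hot encoding, and the assumption that the context is at least seq_len characters long are all illustrative choices, not the course's actual implementation.

import numpy as np

def generate_text(model, context, char_to_index, index_to_char,
                  seq_len=40, num_chars=100):
    # Start from the given context and append one character at a time
    generated = context
    for _ in range(num_chars):
        # One-hot encode the last seq_len characters of the running text
        window = generated[-seq_len:]
        x = np.zeros((1, seq_len, len(char_to_index)))
        for t, ch in enumerate(window):
            if ch in char_to_index:
                x[0, t, char_to_index[ch]] = 1.0
        # Ask the model for the next-character distribution and pick the
        # most likely character (greedy decoding)
        probs = model.predict(x, verbose=0)[0]
        generated += index_to_char[int(np.argmax(probs))]
    return generated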

This exercise is part of the course

Recurrent Neural Networks (RNNs) for Language Modeling with Keras


Exercise instructions

  • Use the pre-defined function generate_sheldon_phrase() with the parameters sheldon_model and sheldon_context, and store the output in the sheldon_phrase variable.
  • Print the obtained phrase.
  • Store the given text into the poem_context variable.
  • Print the poem generated by applying the function generate_poem() with the poem_model and poem_context parameters.

Interactive exercise

Complete the sample code to finish this exercise successfully.

# Context for Sheldon phrase
sheldon_context = "I’m not insane, my mother had me tested. "

# Generate one Sheldon phrase
sheldon_phrase = ____(sheldon_model, sheldon_context)

# Print the phrase
print(____)

# Context for poem
____ = "May thy beauty forever remain"

# Print the poem
print(generate_poem(____, poem_context))
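
For reference, one completed version of the scaffold, following the steps above (the exercise environment is assumed to preload sheldon_model, poem_model, generate_sheldon_phrase(), and generate_poem()):

# Context for Sheldon phrase
sheldon_context = "I’m not insane, my mother had me tested. "

# Generate one Sheldon phrase
sheldon_phrase = generate_sheldon_phrase(sheldon_model, sheldon_context)

# Print the phrase
print(sheldon_phrase)

# Context for poem
poem_context = "May thy beauty forever remain"

# Print the poem
print(generate_poem(poem_model, poem_context))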