
Formatting prompts for Llama

Models can sometimes struggle to separate the task, expected output, and additional context from a long, unstructured prompt. To remedy this, you can insert clear labels to break up and differentiate this information for the model.

The Llama model is available as llm and will remain so for the remainder of the course.
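As a quick illustration of this labeling pattern, the sketch below builds a prompt with Instruction, Question, and Answer sections on an unrelated topic and passes it to llm. It assumes llm behaves like the llama-cpp-python callable used in the exercise below (returning a dictionary with a choices list); the names example_prompt and example_output are placeholders for illustration only.

# A labeled prompt: each section is clearly marked for the model
example_prompt = """
Instruction: Explain the concept of photosynthesis in simple terms.
Question: What is photosynthesis?
Answer:
"""

# Stop at "Question:" so the model does not generate a follow-up question
example_output = llm(example_prompt, max_tokens=15, stop=["Question:"])
print(example_output['choices'][0]['text'])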

This exercise is part of the course

Working with Llama 3


Instructions

  • Add the labels Instruction, Question, and Answer to the prompt to format it more effectively.
  • Pass the prompt to the model.

Hands-on interactive exercise

Try this exercise by completing the sample code.

# Add formatting to the prompt
prompt = """
____: Explain the concept of gravity in simple terms.
____: What is gravity?
____:
"""

# Send the prompt to the model
output = llm(____, max_tokens=15, stop=["Question:"])
print(output['choices'][0]['text'])