Formatting prompts for Llama
Models can sometimes struggle to separate the task, expected output, and additional context from a long, unstructured prompt. To remedy this, you can insert clear labels to break up and differentiate this information for the model.
The Llama model is available as llm and will remain available for the remainder of the course.
This exercise is part of the course
Working with Llama 3
Exercise instructions
- Add the labels Instruction, Question, and Answer to the prompt to format it more effectively.
- Pass the prompt to the model.
Interactive exercise
Complete the sample code to successfully finish this exercise.
# Add formatting to the prompt
prompt = """
____: Explain the concept of gravity in simple terms.
____: What is gravity?
____:
"""
# Send the prompt to the model
output = llm(____, max_tokens=15, stop=["Question:"])
print(output['choices'][0]['text'])
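As a sketch of what the completed exercise looks like: the three blanks become the labels Instruction, Question, and Answer, and the labeled prompt is passed to the model. Since no model file is available here, the llm below is a hypothetical stand-in with the same call signature as the course's llama-cpp-python model object; its canned reply is illustrative, not real model output.

```python
# Completed prompt with the three labels filled in
prompt = """
Instruction: Explain the concept of gravity in simple terms.
Question: What is gravity?
Answer:
"""

# Hypothetical stand-in for the course's llm object; a real
# llama-cpp-python model is called the same way:
#   llm(prompt, max_tokens=..., stop=[...])
def llm(prompt, max_tokens=15, stop=None):
    return {"choices": [{"text": " Gravity is the force that pulls objects toward each other."}]}

# Send the labeled prompt to the model; stop=["Question:"] prevents the
# model from generating a new question after its answer
output = llm(prompt, max_tokens=15, stop=["Question:"])
print(output["choices"][0]["text"])
```

The stop sequence matters with this format: without it, the model often continues the Question/Answer pattern and invents follow-up questions instead of ending after its answer.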