Guiding customer support responses
You work for an e-commerce company and are integrating Llama into a customer support assistant. The assistant answers frequently asked questions, but you've noticed that responses are too repetitive.
You need to modify decoding parameters to encourage more varied wording while keeping responses informative.
The model is already instantiated using llama_cpp and is stored in llm.
This exercise is part of the course
Working with Llama 3
Exercise instructions
- Set the temperature parameter so that responses are less repetitive and more dynamic.
Interactive exercise
Complete the sample code to successfully finish this exercise.
output = llm(
    "Can I exchange an item I purchased?",
    # Set the temperature parameter to provide more varied responses
    temperature=____,
    max_tokens=15
)
print(output['choices'][0]['text'])
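As background for choosing a value: temperature rescales the model's token logits before sampling, so higher values flatten the probability distribution and make word choices more varied, while lower values sharpen it toward the most likely (and more repetitive) tokens. A minimal sketch of this effect, using hypothetical logits and plain Python rather than the actual model:

```python
import math

def softmax_with_temperature(logits, temperature):
    # Divide logits by the temperature, then apply softmax.
    # temperature > 1 flattens the distribution (more varied sampling);
    # temperature < 1 sharpens it (more deterministic, repetitive output).
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for three candidate next tokens
logits = [2.0, 1.0, 0.5]

sharp = softmax_with_temperature(logits, 0.5)  # low temperature
flat = softmax_with_temperature(logits, 1.5)   # high temperature

# With high temperature, the top token's probability drops and the
# distribution spreads out, so sampling picks varied tokens more often.
print(max(sharp) > max(flat))
```

This is why raising temperature in the llm() call above makes the assistant's replies less repetitive: the sampler is drawing from a flatter distribution over candidate tokens.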