System messages
Your previous message setup returned a valid response, but it also lets users ask any question they like, even if it doesn't relate to internet service support. In this exercise, you'll use a system message to steer the model into answering only customer questions about their internet service.
The Llama model is still available as llm.
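The course environment creates llm for you. If you're recreating the setup locally, and assuming llm comes from the llama-cpp-python library (which provides the create_chat_completion method used below), loading a model might look roughly like this; the model path is a placeholder, not a course file.
from llama_cpp import Llama

# Load a local Llama 3 model file; the path below is only a placeholder
llm = Llama(model_path="path/to/llama-3-model.gguf")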
This exercise is part of the course Working with Llama 3.
Exercise instructions
- Add the "system" role to the message dictionary provided in the conv list.
- Extract the model response "message" from the result object; a sketch of the result's structure follows this list.
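As a rough guide, create_chat_completion returns an OpenAI-style dictionary, so the reply sits inside a nested "message" entry of the first choice. The values below are illustrative placeholders, not real model output.
# Illustrative shape of the result object (placeholder values only)
result = {
    "choices": [
        {
            "message": {"role": "assistant", "content": "Sorry, I can't answer that."},
            "finish_reason": "stop"
        }
    ]
}

# Index into choices, then the message, then its content to reach the reply text
print(result["choices"][0]["message"]["content"])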
Hands-on interactive exercise
Have a go at this exercise by completing this sample code.
# Add a system message to the conversation list
conv = [
    {
        "role": "____",
        "content": "You are a helpful and professional customer support assistant for an internet service provider. If the question or instruction doesn't relate to internet service, quote the response: 'Sorry, I can't answer that.'"
    },
    {
        "role": "user",
        "content": "Help me decide which stocks to invest in."
    }
]
result = llm.create_chat_completion(messages=conv, max_tokens=15)
# Extract the model response from the result object
assistant_content = result["choices"][0]["____"]["content"]
print(assistant_content)