
System messages

Your previous message produced a valid response, but it also allowed users to ask any question they like, even one unrelated to internet service support. In this exercise, you'll use a system message to steer the model so it only answers customer questions about their internet service.

The Llama model is still available as llm.
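
If you want to reproduce the exercise environment yourself, llm can be created with llama-cpp-python. The sketch below is an assumption about that setup, not part of the exercise: the model path is hypothetical, and the chat_format value is what llama-cpp-python typically uses for Llama 3 instruct GGUF builds.

# Setup sketch (assumption): load a local Llama 3 model so that `llm`
# exposes create_chat_completion(), as the exercise expects.
from llama_cpp import Llama

llm = Llama(
    model_path="path/to/llama-3-8b-instruct.gguf",  # hypothetical local GGUF file
    chat_format="llama-3",  # apply the Llama 3 chat template to messages
    n_ctx=2048,             # context window size
)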

This exercise is part of the course

Working with Llama 3


Exercise instructions

  • Add the "system" role to the message dictionary provided in the conv list.
  • Extract the model response "message" from the result object.

Interactive exercise

Complete the sample code to successfully finish this exercise.

# Add a system message to the conversation list
conv = [
    {
        "role": "____",
        "content": "You are a helpful and professional customer support assistant for an internet service provider. If the question or instruction doesn't relate to internet service, quote the response: 'Sorry, I can't answer that.'"
    },
    {
        "role": "user",
        "content": "Help me decide which stocks to invest in."
    }
]

result = llm.create_chat_completion(messages=conv, max_tokens=15)
# Extract the model response from the result object
assistant_content = result["choices"][0]["____"]["content"]
print(assistant_content)
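
For reference, the result returned by create_chat_completion follows the OpenAI-style chat completion layout, so the assistant reply sits inside the first entry of "choices". The sketch below is an optional inspection step, not part of the graded exercise; it only pretty-prints the result so you can see where the "message" dictionary lives.

# Optional inspection sketch: pretty-print the result to see the nesting
# of the "message" dictionary inside the response.
import json

print(json.dumps(result, indent=2))
# Abridged shape of the response:
# {
#   "choices": [
#     {
#       "message": {"role": "assistant", "content": "..."},
#       ...
#     }
#   ],
#   ...
# }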