
System messages

Your previous message produced a valid response, but it also lets users ask any question they like, even one unrelated to internet service support. In this exercise, you'll use a system message to steer the model toward answering only customer questions about their internet service.

The Llama model is still available as llm.

This exercise is part of the course Working with Llama 3.

Exercise instructions

  • Add the "system" role to the message dictionary provided to the conv list.
  • Extract the model response "message" from the result object.

Hands-on interactive exercise

Finish this exercise by completing the sample code below.

# Add a system message to the conversation list
conv = [
    {
        "role": "____",
        "content": "You are a helpful and professional customer support assistant for an internet service provider. If the question or instruction doesn't relate to internet service, quote the response: 'Sorry, I can't answer that.'"
    },
    {
        "role": "user",
        "content": "Help me decide which stocks to invest in."
    }
]

result = llm.create_chat_completion(messages=conv, max_tokens=15)
# Extract the model response from the result object
assistant_content = result["choices"][0]["____"]["content"]
print(assistant_content)
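
The instructions above already name the two missing pieces, so a completed version would look like the sketch below. It assumes llm is the preloaded llama-cpp-python Llama object mentioned earlier and that create_chat_completion() returns an OpenAI-style dictionary, so the reply text sits under choices[0]["message"]["content"].

# Completed sketch: the "system" role fills the first blank, "message" fills the second
conv = [
    {
        "role": "system",
        "content": "You are a helpful and professional customer support assistant for an internet service provider. If the question or instruction doesn't relate to internet service, quote the response: 'Sorry, I can't answer that.'"
    },
    {
        "role": "user",
        "content": "Help me decide which stocks to invest in."
    }
]

# llm is assumed to be the Llama instance provided by the exercise
result = llm.create_chat_completion(messages=conv, max_tokens=15)
assistant_content = result["choices"][0]["message"]["content"]
print(assistant_content)  # Expected to refuse the off-topic request, e.g. "Sorry, I can't answer that."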