Providing context through sample conversations
Suppose there is a delivery service named MyPersonalDelivery that offers a wide range of delivery options for various items. You want to create a customer service chatbot that supports customers with whatever they need. To accomplish this, you will provide a previous conversation, consisting of a context_question and a context_answer about the items the company delivers, and you will test whether the model recognizes this context when responding to a new user prompt.
The OpenAI package and the context_question and context_answer strings have been pre-loaded for you.
This exercise is part of the course
Prompt Engineering with the OpenAI API
Exercise instructions
- Define a system_prompt that defines the purpose of the chatbot and guides it to answer queries in a gentle way.
- Use the system_prompt, the context_question, and the context_answer to formulate a conversation that the chatbot can use as context in order to respond to the new user query.
Hands-on interactive exercise
Try this exercise by completing this sample code.
client = OpenAI(api_key="")
# Define the system prompt
system_prompt = "____"
context_question = "What types of items can be delivered using MyPersonalDelivery?"
context_answer = "We deliver everything from everyday essentials such as groceries, medications, and documents to larger items like electronics, clothing, and furniture. However, please note that we currently do not offer delivery for hazardous materials or extremely fragile items requiring special handling."
# Add the context to the model
response = client.chat.completions.create(
  model="gpt-4o-mini",
  messages=[{"role": "____", "content": ____},
            {"role": "____", "content": ____},
            {"role": "____", "content": ____},
            {"role": "user", "content": "Do you deliver furniture?"}])
response = response.choices[0].message.content
print(response)
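For reference, here is one possible way to fill in the blanks. The exact wording of the system prompt below is only an example; any prompt that defines the chatbot's purpose and asks it to answer in a gentle way will do. The earlier conversation is passed as a user turn containing context_question, followed by an assistant turn containing context_answer, before the new user question.

# One possible solution sketch; the system prompt wording is just an example
system_prompt = "You are a gentle and helpful customer service chatbot for MyPersonalDelivery, assisting customers with any questions about our delivery services."

# Provide the previous conversation as context, then ask the new question
response = client.chat.completions.create(
  model="gpt-4o-mini",
  messages=[{"role": "system", "content": system_prompt},
            {"role": "user", "content": context_question},
            {"role": "assistant", "content": context_answer},
            {"role": "user", "content": "Do you deliver furniture?"}])

print(response.choices[0].message.content)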