
Providing context through sample conversations

Suppose there is a delivery service named MyPersonalDelivery that offers a wide range of delivery options for various items. You want to create a customer service chatbot that supports customers with whatever they need. To accomplish this, you will provide the model with a previous conversation, consisting of a context_question and a context_answer about the items the company delivers, and you will test whether the model recognizes this context when responding to a new user prompt.

The OpenAI package, as well as the context_question and context_answer strings, has been pre-loaded for you.

This exercise is part of the course

ChatGPT Prompt Engineering for Developers


Exercise instructions

  • Define a system_prompt that establishes the purpose of the chatbot and guides it to answer queries in a gentle way.
  • Use the system_prompt, the context_question, and the context_answer to formulate a conversation that the chatbot can use as context in order to respond to the new user query.

Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

client = OpenAI(api_key="")

# Define the system prompt
system_prompt = "____"

context_question = "What types of items can be delivered using MyPersonalDelivery?"
context_answer = "We deliver everything from everyday essentials such as groceries, medications, and documents to larger items like electronics, clothing, and furniture. However, please note that we currently do not offer delivery for hazardous materials or extremely fragile items requiring special handling."

# Add the context to the model
response = client.chat.completions.create(
  model="gpt-4o-mini",
  messages=[{"role": "____", "content": ____},
            {"role": "____", "content": ____},
            {"role": "____", "content": ____ },
            {"role": "user", "content": "Do you deliver furniture?"}])
response = response.choices[0].message.content
print(response)
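One way to complete the blanks is sketched below. The exact wording of the system prompt is only an illustration; the key point is that the previous question-and-answer exchange is passed as user and assistant messages before the new query, so the model can draw on it as context.

from openai import OpenAI

client = OpenAI(api_key="")

# System prompt establishing the chatbot's purpose and tone (wording is illustrative)
system_prompt = "You are a friendly customer service chatbot for MyPersonalDelivery. Answer customer questions gently and helpfully."

context_question = "What types of items can be delivered using MyPersonalDelivery?"
context_answer = "We deliver everything from everyday essentials such as groceries, medications, and documents to larger items like electronics, clothing, and furniture. However, please note that we currently do not offer delivery for hazardous materials or extremely fragile items requiring special handling."

# Pass the system prompt first, then the previous exchange as context,
# and finally the new user query
response = client.chat.completions.create(
  model="gpt-4o-mini",
  messages=[{"role": "system", "content": system_prompt},
            {"role": "user", "content": context_question},
            {"role": "assistant", "content": context_answer},
            {"role": "user", "content": "Do you deliver furniture?"}])

print(response.choices[0].message.content)

With the prior exchange included, the model can answer the furniture question consistently with the earlier context_answer, e.g. confirming that furniture is delivered while noting the exclusions for hazardous or extremely fragile items.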