Enable multi-turn conversation with memory
You are just about ready to share your chatbot update with the school administration! For students to have a smooth learning experience, it's important to enable them to ask follow-up questions. This way, if any information is missing from the chatbot's first answer, the students can modify their questions by having a conversation. You'll now adapt your chatbot's streaming function to enable multiple turns, printing both the user's query and the chatbot's answer. To enable memory, LangGraph will send the full conversation to the LLM when follow-up questions are asked. To start with, your config parameters have already been set for one user:
config = {"configurable": {"thread_id": "1"}}
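For reference, the thread_id in this config only takes effect if the graph was compiled with a checkpointer. Below is a minimal sketch of that setup using LangGraph's in-memory checkpointer; the name graph_builder is an assumption standing in for however the graph was constructed earlier, and the course's own setup may differ.

from langgraph.checkpoint.memory import MemorySaver

# Assumption: graph_builder is the StateGraph built earlier in the course.
# Compiling with a checkpointer lets LangGraph persist the message history,
# keyed by the thread_id in config, so follow-up questions are answered with
# the full conversation in context.
app = graph_builder.compile(checkpointer=MemorySaver())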
This exercise is part of the course Designing Agentic Systems with LangChain.
Exercise instructions
- For each turn, start by printing the user's query from the queries list.
- Iterate through msg and metadata using app.stream(), passing query as the content of HumanMessage along with config, and join the msg.content values.
- To extract the chatbot's responses, print msg.content while excluding any msg labeled HumanMessage, adding a new line before the next query.
Hands-on interactive exercise
Have a go at this exercise by completing this sample code.
# Print the user query first for every interaction
def user_agent_multiturn(queries):
    for ____ in ____:
        print(f"User: {____}")
        # Stream through messages corresponding to queries, excluding metadata
        print("Agent: " + "".join(____.____ for ____, ____ in app.____(
            {"messages": [____(____=____)]}, config, stream_mode="messages")
            # Filter out the human messages to print agent messages
            if ____.____ and not isinstance(____, ____)) + "____")

queries = ["Is `stressed desserts?` a palindrome?", "What about the word `kayak`?",
           "What happened on May 8th, 1945?", "What about 9 November 1989?"]
user_agent_multiturn(queries)
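If you get stuck, one possible way to fill in the blanks is sketched below. It assumes app and config are defined as in the setup above, and it is a reference sketch rather than necessarily the course's official solution.

from langchain_core.messages import HumanMessage

def user_agent_multiturn(queries):
    for query in queries:
        # Echo the user's query for this turn
        print(f"User: {query}")
        # Stream the model's tokens for this query, skipping the echoed HumanMessage,
        # and add a new line before the next query
        print("Agent: " + "".join(
            msg.content for msg, metadata in app.stream(
                {"messages": [HumanMessage(content=query)]},
                config, stream_mode="messages")
            if msg.content and not isinstance(msg, HumanMessage)) + "\n")

queries = ["Is `stressed desserts?` a palindrome?", "What about the word `kayak`?",
           "What happened on May 8th, 1945?", "What about 9 November 1989?"]
user_agent_multiturn(queries)

Because both calls share the same thread_id in config, the second, third, and fourth queries are answered with the earlier turns still in memory, which is what lets the chatbot resolve follow-ups like "What about the word `kayak`?".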