Create a function to return an LLM response
Your chatbot now has plenty of tools available, but it's still useful to invoke the LLM on its own when a question is unrelated to any of the tools added to the chatbot. You'll now define a function that checks the last message in the conversation for any potential tool calls. If none are present, the chatbot will simply use the LLM to return an answer. To handle both the user's queries and the chatbot's responses, the following modules have been imported for you to handle the different message types.
from langchain_core.messages import AIMessage, HumanMessage
This exercise is part of the course
Designing Agentic Systems with LangChain
Exercise instructions
- Access the last message from the `state` using `"messages"`.
- Write a conditional statement to check if the `last_message` is an `AIMessage` and that this message also features `tool_calls`.
- If the condition is met, return the first `"response"` from `tool_calls` taken from the `last_message` in the `content` field of the `AIMessage`.
- If the condition is not met, apply `.invoke()` to `model_with_tools` to generate a response, passing the full conversation history from `state["messages"]`.
Hands-on interactive exercise
Have a go at this exercise by completing this sample code.
# Extract the last message from the history
def call_model(state: MessagesState):
    ____ = ____["____"][____]
    # If the last message has tool calls, return the tool's response
    if isinstance(____, ____) and ____.____:
        # Return only the messages from the tool call
        return {"messages": [____(content=____.____[0]["____"])]}
    # Otherwise, proceed with a regular LLM response
    return {"messages": [____.____(____["____"])]}
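One possible completed version of `call_model` is sketched below. Because the LangChain runtime and a bound model aren't available here, lightweight stand-ins for `AIMessage`, `HumanMessage`, and `model_with_tools` are defined purely for illustration (an assumption, not the real library classes); in the exercise itself you would use the real imports shown above and your existing `model_with_tools`.

```python
# Stand-ins for the LangChain objects used in the exercise.
# These are simplified assumptions for illustration only.
class HumanMessage:
    def __init__(self, content):
        self.content = content

class AIMessage:
    def __init__(self, content, tool_calls=None):
        self.content = content
        self.tool_calls = tool_calls or []

class FakeModelWithTools:
    """Hypothetical stand-in for a model bound to tools."""
    def invoke(self, messages):
        # Always return a fixed answer in place of a real LLM call
        return AIMessage(content="LLM answer")

model_with_tools = FakeModelWithTools()

def call_model(state):
    # Extract the last message from the history
    last_message = state["messages"][-1]
    # If the last message is an AIMessage carrying tool calls,
    # return the first tool call's response as the message content
    if isinstance(last_message, AIMessage) and last_message.tool_calls:
        return {"messages": [AIMessage(content=last_message.tool_calls[0]["response"])]}
    # Otherwise, proceed with a regular LLM response over the full history
    return {"messages": [model_with_tools.invoke(state["messages"])]}

# Tool-call branch: the tool's response is passed through
state = {"messages": [AIMessage(content="", tool_calls=[{"response": "42"}])]}
print(call_model(state)["messages"][0].content)  # -> 42

# Plain branch: the model is invoked on the conversation history
state = {"messages": [HumanMessage(content="Hello")]}
print(call_model(state)["messages"][0].content)  # -> LLM answer
```

Note that in the real exercise the `"response"` key on each tool call follows the course's convention; the logic is the same either way — branch on `isinstance(..., AIMessage)` plus a non-empty `tool_calls` list.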