Create a function to return an LLM response
Your chatbot now has plenty of tools available, but it's still useful to invoke the LLM on its own when a question is unrelated to any of them. You'll now define a function that checks the last message in the conversation for tool calls; if there are none, the chatbot will simply use the LLM to return an answer. To handle both the user's queries and the chatbot's responses, the following message classes have been imported for you:
```python
from langchain_core.messages import AIMessage, HumanMessage
```
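For reference, the conversation history stored in the state is simply a list of these message objects, and an `AIMessage` carries a `tool_calls` list that stays empty for plain-text replies; this is the property the conditional check in this exercise relies on. A minimal sketch with made-up content:

```python
from langchain_core.messages import AIMessage, HumanMessage

# A hypothetical two-turn history: a user query and a plain-text model reply
history = [
    HumanMessage(content="Hi there!"),
    AIMessage(content="Hello! How can I help?"),
]

last_message = history[-1]
print(isinstance(last_message, AIMessage))  # True
print(last_message.tool_calls)              # [] -- no tool calls to handle
```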
This exercise is part of the course Designing Agentic Systems with LangChain.
Exercise instructions
- Access the last message from the `state` using `"messages"`.
- Write a conditional statement to check that `last_message` is an `AIMessage` and that this message also features `tool_calls`.
- If the condition is met, return the first `"response"` from `tool_calls` taken from the `last_message` in the `content` field of the `AIMessage`.
- If the condition is not met, apply `.invoke()` to `model_with_tools` to generate a response, passing the full conversation history from `state["messages"]`.
Hands-on interactive exercise
Try this exercise by completing the sample code below.
```python
from langgraph.graph import MessagesState  # state schema holding the message history

def call_model(state: MessagesState):
    # Extract the last message from the history
    last_message = state["messages"][-1]

    # If the last message has tool calls, return the tool's response
    if isinstance(last_message, AIMessage) and last_message.tool_calls:
        # Return only the messages from the tool call
        return {"messages": [AIMessage(content=last_message.tool_calls[0]["response"])]}

    # Otherwise, proceed with a regular LLM response, passing the full history
    # (model_with_tools is the tool-bound LLM defined earlier in the course)
    return {"messages": [model_with_tools.invoke(state["messages"])]}
```