
Create a function to return an LLM response

Your chatbot now has plenty of tools available, but it's still useful to invoke the LLM on its own when a question is unrelated to any of the tools that have been added to the chatbot. You'll now define a function that checks the last message in the conversation for any potential tool calls. If there are none, the chatbot will simply use the LLM to return an answer. To handle both the user's queries and the chatbot's responses, the following message classes have been imported for you:

from langchain_core.messages import AIMessage, HumanMessage
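
For illustration, here's a tiny, hypothetical conversation history showing how the two message types represent the two sides of the chat (the content strings here are made up):

# A hypothetical two-turn history: the user's query is a HumanMessage,
# and the model's reply is an AIMessage
history = [
    HumanMessage(content="What's the weather like in Paris today?"),
    AIMessage(content="Let me check the forecast for you."),
]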

This exercise is part of the course Designing Agentic Systems with LangChain.

Exercise instructions

  • Access the last message from the state using "messages".
  • Write a conditional statement to check that last_message is an AIMessage and that it also contains tool_calls.
  • If the condition is met, return an AIMessage whose content field holds the "response" from the first entry in last_message.tool_calls.
  • If the condition is not met, apply .invoke() to model_with_tools to generate a response, passing the full conversation history from state["messages"].

Hands-on interactive exercise

Have a go at this exercise. The sample code below follows the exercise instructions step by step.

# Extract the last message from the history
def call_model(state: MessagesState):
    last_message = state["messages"][-1]

    # If the last message has tool calls, return the tool's response
    if isinstance(last_message, AIMessage) and last_message.tool_calls:

        # Return only the response from the first tool call
        return {"messages": [AIMessage(content=last_message.tool_calls[0]["response"])]}

    # Otherwise, invoke the tool-bound model on the full conversation history
    return {"messages": [model_with_tools.invoke(state["messages"])]}
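Once call_model is defined, it can be wired into a LangGraph workflow so every turn of the conversation routes through it. The sketch below is a minimal, self-contained example of how that might look; the ChatOpenAI model, the multiply tool, and the single-node graph layout are illustrative assumptions, not the course's exact setup.

from langchain_core.messages import HumanMessage
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langgraph.graph import StateGraph, MessagesState, START, END

# A stand-in tool so the sketch is self-contained
@tool
def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

# Assumes an OpenAI API key is configured in the environment;
# any chat model with tool-calling support would work here
llm = ChatOpenAI(model="gpt-4o-mini")
model_with_tools = llm.bind_tools([multiply])

# Build a one-node graph that routes every turn through call_model
workflow = StateGraph(MessagesState)
workflow.add_node("chatbot", call_model)
workflow.add_edge(START, "chatbot")
workflow.add_edge("chatbot", END)
app = workflow.compile()

result = app.invoke({"messages": [HumanMessage(content="Hi there!")]})
print(result["messages"][-1].content)

Because the conversation state is passed in as state["messages"], the model sees the full history on every call, which is what lets it answer follow-up questions that don't need any of the bound tools.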