
Create a function to return an LLM response

Your chatbot now has plenty of tools available, but it's still useful to invoke the LLM on its own when a question is unrelated to any of the tools that have been added. You'll now define a function that checks the last message in the conversation for any tool calls; if none are present, the chatbot will simply use the LLM to return an answer. To handle both the user's queries and the chatbot's responses, the following message classes have been imported for you:

from langchain_core.messages import AIMessage, HumanMessage
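As a quick illustration (hypothetical messages, not part of the exercise code), HumanMessage wraps user input and AIMessage wraps model output, and both take the message text through their content field:

# Hypothetical example messages, separate from the exercise below
user_msg = HumanMessage(content="What's the weather like in Rome?")
bot_msg = AIMessage(content="Let me check that with the weather tool.")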

This exercise is part of the course Designing Agentic Systems with LangChain.

Exercise instructions

  • Access the last message from the state using "messages".
  • Write a conditional statement to check that last_message is an AIMessage and that it also contains tool_calls.
  • If the condition is met, return an AIMessage whose content field holds the first "response" from the last_message's tool_calls.
  • If the condition is not met, apply .invoke() to model_with_tools to generate a response, passing the full conversation history from state["messages"].

Hands-on interactive exercise

Try to solve this exercise by completing the sample code.

def call_model(state: MessagesState):
    # Extract the last message from the history
    ____ = ____["____"][____]

    # If the last message has tool calls, return the tool's response
    if isinstance(____, ____) and ____.____:

        # Return only the messages from the tool call
        return {"messages": [____(content=____.____[0]["____"])]}

    # Otherwise, proceed with a regular LLM response
    return {"messages": [____.____(____["____"])]}
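As a usage sketch (assuming the standard LangGraph imports shown here; the node name "call_model" is an arbitrary choice), the function can then be wired into a graph and invoked:

from langchain_core.messages import HumanMessage
from langgraph.graph import StateGraph, MessagesState, START, END

# Register call_model as a node and route the graph straight through it
builder = StateGraph(MessagesState)
builder.add_node("call_model", call_model)
builder.add_edge(START, "call_model")
builder.add_edge("call_model", END)
graph = builder.compile()

response = graph.invoke({"messages": [HumanMessage(content="Hi there!")]})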