Introduction to Model Context Protocol (MCP)


Exercise

Building the Message and Calling the LLM

With your get_context_from_mcp(user_query) helper function created to return the resource text and prompt text, it's time to pass that information to the LLM!

The currency server, get_context_from_mcp(), get_tools_from_mcp(), call_mcp_tool(), and the OpenAI client are set up in the background. You need to complete the function that builds the system content, calls the model, and handles either a direct message or a tool call. You've been provided with an ambiguous and an unambiguous user input to see if your MCP prompts make a difference!

Instructions

100 XP
  • On line 37, build full_prompt by concatenating prompt_text, the string "\n\nSupported currencies:\n", and resource_text.
  • On line 44, send full_prompt and the openai_tools list to the model.
  • On lines 53-55, if the output type is "message", return str(output.content[0].text).
  • On lines 58-60, if the output type is "function_call", pass the .name attribute of output and the arguments returned by json.loads(output.arguments) to call_mcp_tool().