Chat prompt templates
Given the importance of chat models in many LLM applications, LangChain provides functionality for creating prompt templates that structure messages for different chat roles.
The ChatPromptTemplate class has already been imported for you, and an LLM has already been defined.
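If you run this exercise outside the course environment, you would also need the imports below (a minimal sketch, assuming the langchain-openai and langchain-core packages are installed):
# Minimal imports assumed for running this exercise locally
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate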
This exercise is part of the course
Developing LLM Applications with LangChain
Exercise instructions
- Use ChatPromptTemplate.from_messages() to convert the role-message pairs into a chat prompt template.
- Assign appropriate roles to the messages provided to create a conversation pattern.
- Create an LCEL chain and invoke it with the input provided.
Hands-on interactive exercise
Try to solve this exercise by completing the sample code.
llm = ChatOpenAI(model="gpt-4o-mini", api_key='')
# Create a chat prompt template
prompt_template = ChatPromptTemplate.from_messages(
    [
        # System message sets the model's persona
        ("system", "You are a geography expert that returns the colors present in a country's flag."),
        # One human/AI exchange acts as a few-shot example
        ("human", "France"),
        ("ai", "blue, white, red"),
        # The final human message takes the input variable
        ("human", "{country}")
    ]
)
# Chain the prompt template and model, and invoke the chain
llm_chain = prompt_template | llm
country = "Japan"
response = llm_chain.invoke({"country": country})
print(response.content)
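To see how the template fills in the {country} placeholder before any model call, you can format it directly; this is a small sketch using ChatPromptTemplate's format_messages(), which returns the list of messages that would be sent to the LLM:
# Inspect the formatted messages without calling the LLM
messages = prompt_template.format_messages(country="Japan")
for message in messages:
    print(message.type, ":", message.content)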