Implementing few-shot prompting

Time to combine your components into a chain! The few-shot prompt you created in the previous exercise is still available for you to use, along with examples and example_prompt.

All of the LangChain classes necessary for completing this exercise have been pre-loaded for you.
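For reference, the pre-loaded variables might look something like the sketch below. The exact examples and wording are assumptions, since they come from the previous exercise, but the structure matches what FewShotPromptTemplate expects: a list of example dicts plus a PromptTemplate that formats each one.

# Hypothetical definitions of the pre-loaded variables (the actual examples
# from the previous exercise may differ).
from langchain_core.prompts import PromptTemplate

examples = [
    {
        "question": "What is Henri's favorite technology on DataCamp?",
        "answer": "Henri's favorite technology on DataCamp is Python.",
    },
]

# Formats a single example; its variables match the keys in each example dict
example_prompt = PromptTemplate.from_template(
    "Question: {question}\n{answer}"
)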

This exercise is part of the course

Developing LLM Applications with LangChain

View the course

Exercise instructions

  • Instantiate an OpenAI chat LLM using the ChatOpenAI class.
  • Create a chain from the prompt template and LLM using the | operator, then invoke it using the .invoke() method.

Interactive hands-on exercise

Try to solve this exercise by completing the sample code.

prompt_template = FewShotPromptTemplate(
    examples=examples,
    example_prompt=example_prompt,
    suffix="Question: {input}",
    input_variables=["input"],
)

# Create an OpenAI chat LLM
llm = ____(model="gpt-4o-mini", api_key='')

# Create and invoke the chain
llm_chain = ____
print(____({"input": "What is Jack's favorite technology on DataCamp?"}))
Edit and run the code
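One possible completed version is sketched below. It assumes examples and example_prompt are defined as in the previous exercise, and it imports the classes explicitly so the snippet is self-contained; the api_key value is a placeholder for your own OpenAI key.

# A possible solution sketch (assumes `examples` and `example_prompt` exist)
from langchain_openai import ChatOpenAI
from langchain_core.prompts import FewShotPromptTemplate

prompt_template = FewShotPromptTemplate(
    examples=examples,
    example_prompt=example_prompt,
    suffix="Question: {input}",
    input_variables=["input"],
)

# Create an OpenAI chat LLM ("<OPENAI_API_TOKEN>" is a placeholder)
llm = ChatOpenAI(model="gpt-4o-mini", api_key="<OPENAI_API_TOKEN>")

# Compose the prompt template and LLM into a chain with the | operator
llm_chain = prompt_template | llm

# Invoke the chain with the input variable expected by the prompt
response = llm_chain.invoke({"input": "What is Jack's favorite technology on DataCamp?"})
print(response.content)

Because the chain ends with a chat model, .invoke() returns a message object, so printing response.content gives just the model's text reply.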