
Hugging Face models in LangChain!

There are thousands of models freely available to download and use on Hugging Face. Hugging Face integrates with LangChain through its partner library, langchain-huggingface.

In this exercise, you'll load and call the crumb/nano-mistral model from Hugging Face. This is an ultra-light LLM designed to be fine-tuned for greater performance.

This exercise is part of the course Developing LLM Applications with LangChain.

Exercise instructions

  • Import HuggingFacePipeline from langchain_huggingface to work with Hugging Face models.
  • Define a text generation LLM by calling HuggingFacePipeline.from_model_id().
  • Set the model_id parameter to specify which Hugging Face model to use.

Hands-on interactive exercise

Complete this sample code to finish the exercise.

# Import the HuggingFacePipeline class for defining Hugging Face pipelines
from langchain_huggingface import ____

# Define the LLM from the Hugging Face model ID
llm = ____.from_model_id(
    ____="crumb/nano-mistral",
    task="text-generation",
    pipeline_kwargs={"max_new_tokens": 20}
)

prompt = "Hugging Face is"

# Invoke the model
response = llm.invoke(prompt)
print(response)