
Chatting with Llama 3

A data analytics company is developing an internal AI assistant to help employees quickly access technical information. As part of this project, they want to integrate Llama 3 and have asked you to test it.

To get started, you initialize the Llama 3 model locally and generate a response to a question.

The model path is already stored in a variable named llama_path.

This exercise is part of the course

Working with Llama 3


Exercise instructions

  • Import the Llama class from llama_cpp.
  • Initialize an instance of Llama, specifying llama_path as the path to the model file.

Interactive exercise

Complete the sample code to finish this exercise successfully.

# Import the Llama class
from llama_cpp import ____

# Initialize the Llama model
llm = ____(model_path=____)

question = "What is the most used database for data storage?"
response = llm(question)
print(response)
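
For reference, one possible completed version is sketched below. It assumes llama_path points to a local GGUF model file readable by llama-cpp-python; the path shown is only a hypothetical placeholder, since the exercise environment defines llama_path for you.

# Import the Llama class
from llama_cpp import Llama

# Hypothetical placeholder path; in the exercise, llama_path is already defined
llama_path = "models/llama-3-model.gguf"

# Initialize the Llama model from the local model file
llm = Llama(model_path=llama_path)

# Run a text completion and print the raw response
question = "What is the most used database for data storage?"
response = llm(question)
print(response)

Calling the model directly returns a completion dictionary rather than plain text; the generated answer itself can be read from response["choices"][0]["text"].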