
Query a chatbot using the Databricks Serving Endpoints API

In the previous exercise, you learned how to use the Databricks SDK to query an AI foundation model. The goal of this exercise is for you to experiment with different user and system prompts when querying AI models using the Databricks SDK.

This exercise is part of the course

Databricks with the Python SDK


Instructions

  • Edit the System prompt and see how the AI model responds differently.
  • Edit the User prompt and see how the AI model responds differently.

Hands-on interactive exercise

Try this exercise by completing this sample code.

from databricks.sdk import WorkspaceClient
from databricks.sdk.service.serving import ChatMessage, ChatMessageRole

w = WorkspaceClient()

response = w.serving_endpoints.query(
    name="databricks-meta-llama-3-3-70b-instruct",
    messages=[
        ChatMessage(  # Edit the System prompt and see how the AI model responds differently
            role=ChatMessageRole.SYSTEM, content=""
        ),
        ChatMessage(  # Edit the User prompt and see how the AI model responds differently
            role=ChatMessageRole.USER, content=""
        ),
    ],
    max_tokens=128,
)
print(f"RESPONSE:\n{response.choices[0].message.content}")