Query a chatbot using the Databricks Serving Endpoints API
In the previous exercise, you learned how to use the Databricks SDK to query an AI Foundation Model. The goal of this exercise is for you to experiment with different user and system prompts when querying AI models with the Databricks SDK.
This exercise is part of the course
Databricks with the Python SDK
Exercise instructions
- Edit the System prompt and see how the AI model responds differently.
- Edit the User prompt and see how the AI model responds differently.
Interactive exercise
Complete the sample code to finish this exercise successfully.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.serving import ChatMessage, ChatMessageRole
w = WorkspaceClient()
response = w.serving_endpoints.query(
    name="databricks-meta-llama-3-3-70b-instruct",
    messages=[
        ChatMessage(  # Edit the System prompt and see how the AI model responds differently
            role=ChatMessageRole.SYSTEM, content=""
        ),
        ChatMessage(  # Edit the User prompt and see how the AI model responds differently
            role=ChatMessageRole.USER, content=""
        ),
    ],
    max_tokens=128,
)
print(f"RESPONSE:\n{response.choices[0].message.content}")