Query a chatbot using the Databricks Serving Endpoints API
In the previous exercise, you learned how to use the Databricks SDK to query an AI Foundation Model. The goal of this exercise is for you to experiment with different user and system prompts when querying AI models using the Databricks SDK.
This exercise is part of the course Databricks with the Python SDK.
Exercise instructions
- Edit the System prompt and see how the AI model responds differently.
- Edit the User prompt and see how the AI model responds differently; an example prompt pair is sketched below.
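A system prompt sets the assistant's overall behavior, while the user prompt carries the actual question. As a rough sketch of what a filled-in prompt pair might look like (the wording here is purely illustrative, not a required answer):

from databricks.sdk.service.serving import ChatMessage, ChatMessageRole

# Illustrative prompt pair -- the wording is an example, not part of the sample code below
system_msg = ChatMessage(
    role=ChatMessageRole.SYSTEM,
    content="You are a concise assistant that answers in one sentence.",
)
user_msg = ChatMessage(
    role=ChatMessageRole.USER,
    content="What is a Databricks serving endpoint?",
)

Swapping either message for a different one is all the exercise asks you to do; try several variations and compare the responses.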
Hands-on interactive exercise
Have a go at this exercise by completing this sample code.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.serving import ChatMessage, ChatMessageRole

# Connect to the Databricks workspace using default authentication
w = WorkspaceClient()

# Query the Foundation Model serving endpoint with a system and a user prompt
response = w.serving_endpoints.query(
    name="databricks-meta-llama-3-3-70b-instruct",
    messages=[
        ChatMessage(  # Edit the System prompt and see how the AI model responds differently
            role=ChatMessageRole.SYSTEM, content=""
        ),
        ChatMessage(  # Edit the User prompt and see how the AI model responds differently
            role=ChatMessageRole.USER, content=""
        ),
    ],
    max_tokens=128,  # Limit the length of the generated response
)

print(f"RESPONSE:\n{response.choices[0].message.content}")