
Building Confidence with LLM Sources

You're building a financial analysis assistant that needs to provide users with current stock market information. Since LLMs have knowledge cutoffs, you need to enable web search to access real-time data. Additionally, for transparency and credibility, you want to show users which sources were consulted during the search.

The OpenAI client has been initialized as client, and you'll be querying for the current price of Netflix stock.
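
For context, the setup that happens behind the scenes might look like the sketch below (assuming the official openai Python package and an OPENAI_API_KEY environment variable; in this exercise, client is already provided for you):

# Minimal setup sketch (assumption: the `openai` package is installed and
# OPENAI_API_KEY is set in the environment; the exercise supplies `client`)
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment by default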

This exercise is part of the course Working with the OpenAI Responses API.

Exercise instructions

  • Create a web search-enabled request, making sure to include the web search sources in the response.
  • Loop through the response items and extract only items with type "web_search_call", then print the .sources from each web search call's .action attribute.

Hands-on interactive exercise

Finish this exercise by completing the sample code below.

# Create a response with web search enabled and sources included
response = client.responses.create(
    model="gpt-5-mini",
    tools=[{"type": "web_search"}],
    input="What is the current stock price of Netflix?",
    include=["web_search_call.action.sources"]
)

# Extract and print the sources consulted by each web search call
for item in response.output:
    if item.type == "web_search_call":
        print(item.action.sources)

print(response.output_text)
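
Note that when web search is enabled, response.output typically contains a mix of item types: one or more web_search_call items describing the searches the model ran, plus a message item holding the final answer. Filtering on item.type before reading .action.sources keeps the loop from touching items that have no sources attached.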