
Guiding customer support responses

You work for an e-commerce company and are integrating Llama into a customer support assistant. The assistant answers frequently asked questions, but you've noticed that responses are too repetitive.

You need to modify decoding parameters to encourage more varied wording while keeping responses informative.

The model has already been instantiated using llama_cpp and is stored in llm.
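For context, here is a minimal sketch of how such an llm object is typically created with llama_cpp; the model path is only a hypothetical placeholder and is not part of the exercise.

from llama_cpp import Llama

# Load a local GGUF model file; the path below is a placeholder
llm = Llama(model_path="./models/llama-3.gguf")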

This exercise is part of the course

Working with Llama 3


Exercise instructions

  • Set the temperature parameter so that responses are less repetitive and more dynamic.

Hands-on interactive exercise

Try this exercise by completing the sample code below.

output = llm(
    "Can I exchange an item I purchased?",
    # Set the temperature parameter to provide more varied responses
    temperature=____,
    max_tokens=15
)

print(output['choices'][0]['text'])
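For reference, one possible completion is sketched below. Higher temperature values flatten the token probability distribution, so the model samples less likely words more often and the wording varies more; the value 0.9 is just an illustrative choice, not the only correct answer.

output = llm(
    "Can I exchange an item I purchased?",
    # A temperature near 1.0 encourages more varied wording;
    # lower values (e.g. 0.2) make output more deterministic and repetitive
    temperature=0.9,
    max_tokens=15
)

print(output['choices'][0]['text'])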