
Experimenting with learning rate

In this exercise, your goal is to find the optimal learning rate such that the optimizer can find the minimum of the non-convex function \(x^{4} + x^{3} - 5x^{2}\) in ten steps.
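
Each SGD step moves \(x\) against the gradient of this objective, scaled by the learning rate: since \(f'(x) = 4x^{3} + 3x^{2} - 10x\), one update is \(x \leftarrow x - \mathrm{lr} \cdot f'(x)\).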

You will experiment with three different learning rate values. For this problem, try learning rate values between 0.001 and 0.1.

You are provided with the optimize_and_plot() function, which takes the learning rate as its first argument. This function runs 10 steps of the SGD optimizer and displays the results.
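
The actual implementation of optimize_and_plot() lives in the exercise environment. As a rough mental model (an assumption, not the course's code), it behaves like a thin wrapper around torch.optim.SGD that runs 10 steps on a single parameter and plots the points visited along the curve. A minimal sketch:

# Minimal sketch of what a helper like optimize_and_plot() could do.
# This is an illustrative assumption, not the exercise's actual implementation;
# the starting point (x_start=2.0) and plotting range are arbitrary choices.
import torch
import matplotlib.pyplot as plt

def f(x):
    return x**4 + x**3 - 5 * x**2

def optimize_and_plot(lr, n_steps=10, x_start=2.0):
    x = torch.tensor(x_start, requires_grad=True)
    optimizer = torch.optim.SGD([x], lr=lr)
    visited = [x.item()]
    for _ in range(n_steps):
        optimizer.zero_grad()   # clear the gradient from the previous step
        loss = f(x)             # evaluate the objective
        loss.backward()         # compute df/dx
        optimizer.step()        # x <- x - lr * df/dx
        visited.append(x.item())

    # Plot the curve and the points the optimizer visited
    xs = torch.linspace(-3.0, 2.5, 200)
    plt.plot(xs.numpy(), f(xs).numpy(), label="f(x)")
    plt.scatter(visited, [f(v) for v in visited], color="red",
                label=f"SGD steps (lr={lr})")
    plt.legend()
    plt.show()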

This exercise is part of the course Introduction to Deep Learning with PyTorch.

Hands-on interactive exercise

Complete this sample code to finish the exercise.

# Try a first learning rate value
lr0 = ____
optimize_and_plot(lr=lr0)
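
A filled-in attempt could look like the lines below. The three values are only illustrative picks from the suggested 0.001 to 0.1 range, not the exercise's expected answer; compare the resulting plots to judge which value reaches the minimum within the 10 steps.

# Illustrative values only; choose your own within the suggested range
lr0 = 0.001   # very small steps may be too timid to reach the minimum in 10 steps
optimize_and_plot(lr=lr0)

lr1 = 0.01
optimize_and_plot(lr=lr1)

lr2 = 0.1     # larger steps may overshoot or oscillate around the minimum
optimize_and_plot(lr=lr2)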