
Batch normalization effects

Batch normalization tends to speed up model training and make learning curves more stable. Let's see how two identical models, one with and one without batch normalization, compare.
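To make the comparison concrete, here is a minimal sketch of how two such models could be built. The layer sizes, input shape, and 10-class softmax output are assumptions for illustration; the exercise's actual architecture may differ.

# Minimal sketch: two otherwise identical classifiers, one with a
# BatchNormalization layer after each hidden layer. Layer sizes, the
# (64,) input shape, and the 10-class output are illustrative assumptions.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, BatchNormalization

def build_model(batchnorm=False):
    model = Sequential()
    model.add(Dense(50, input_shape=(64,), activation='relu'))
    if batchnorm:
        model.add(BatchNormalization())
    model.add(Dense(50, activation='relu'))
    if batchnorm:
        model.add(BatchNormalization())
    model.add(Dense(10, activation='softmax'))
    model.compile(optimizer='sgd', loss='categorical_crossentropy', metrics=['accuracy'])
    return model

standard_model = build_model(batchnorm=False)
batchnorm_model = build_model(batchnorm=True)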

The model you just built, batchnorm_model, is loaded for you to use. An exact copy of it without batch normalization, standard_model, is available as well. You can check their summary() in the console. X_train, y_train, X_test, and y_test are also loaded so that you can train both models.

You will compare the accuracy learning curves of both models by plotting them with compare_histories_acc().

You can inspect the function by pasting show_code(compare_histories_acc) in the console.
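As a rough idea of what such a function might do, here is a hypothetical sketch, not the course's actual implementation: it plots the training and validation accuracy stored in each History object. Note that the metric key names ('accuracy'/'val_accuracy' vs. 'acc'/'val_acc') depend on your Keras version.

# Hypothetical sketch of compare_histories_acc; the real code is
# available via show_code(compare_histories_acc) in the console.
import matplotlib.pyplot as plt

def compare_histories_acc(h1, h2):
    # Each History object's .history dict stores per-epoch metrics
    plt.plot(h1.history['accuracy'])
    plt.plot(h1.history['val_accuracy'])
    plt.plot(h2.history['accuracy'])
    plt.plot(h2.history['val_accuracy'])
    plt.title('Batch Normalization Effects')
    plt.xlabel('Epoch')
    plt.ylabel('Accuracy')
    plt.legend(['Train', 'Test', 'Train with Batch Normalization', 'Test with Batch Normalization'], loc='best')
    plt.show()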

This exercise is part of the course Introduction to Deep Learning with Keras.

Exercise instructions

  • Train the standard_model for 10 epochs passing in train and validation data, storing its history in h1_callback.
  • Train your batchnorm_model for 10 epochs passing in train and validation data, storing its history in h2_callback.
  • Call compare_histories_acc passing in h1_callback and h2_callback.

Hands-on interactive exercise

Complete the sample code below to finish this exercise.

# Train your standard model, storing its history callback
# (fit() returns a History object whose .history dict holds per-epoch metrics)
h1_callback = standard_model.fit(X_train, y_train, validation_data=(X_test, y_test), epochs=10, verbose=0)

# Train the batch normalized model you recently built, store its history callback
h2_callback = batchnorm_model.fit(X_train, y_train, validation_data=(X_test, y_test), epochs=10, verbose=0)

# Call compare_histories_acc passing in both model histories
compare_histories_acc(h1_callback, h2_callback)