Exercise

Multiple hyperparameter tuning

Now that you've successfully used hyperparameter tuning on the Banknote_Authentication dataset, it's time to explore whether adding another hyperparameter to your tuning run will improve the outcome.

Note that, when using the canned dnn_classifier, the activation function defaults to relu. This default has the advantage of avoiding vanishing-gradient problems during backpropagation and trains quickly compared with other activation functions. Another activation function, softmax, gives a probability output and is best suited to recurrent networks and probabilistic models. Which is the better choice for the current model? Use tfruns to find out!
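For reference, a minimal sketch of such a tuning run with tfruns might look like the following (the script name banknote.R and the flag names dropout and activation are illustrative assumptions, not part of the exercise):

# A minimal sketch, assuming a training script "banknote.R" (hypothetical
# name) that declares matching flags with tfruns::flags(), e.g.:
#   FLAGS <- flags(
#     flag_numeric("dropout", 0.2),
#     flag_string("activation", "relu")
#   )
library(tfruns)

# tuning_run() executes the training script once for each combination of
# the flag values supplied below and records each run's metrics.
runs <- tuning_run(
  "banknote.R",
  flags = list(
    dropout    = c(0.2, 0.3, 0.4),
    activation = c("relu", "softmax")
  )
)

# List the completed runs to compare the settings side by side.
ls_runs()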

Instructions 1/2

Question

If we create flags for three dropout rates (0.2, 0.3, and 0.4) and two activation functions (relu and softmax), how many tuning runs will be undertaken?

Possible Answers