
Compiling and fitting a model

1. Compiling and fitting a model

After you've specified a model, the next task is to compile it, which sets up the network for optimization, for instance creating an internal function to do back-propagation efficiently. The compile method

2. Why you need to compile your model

has two important arguments for you to choose. The first is which optimizer to use, which controls the learning rate. In practice, the right choice of learning rate can make a big difference in how quickly your model finds good weights, and even in how good a set of weights it can find. There are a few algorithms that automatically tune the learning rate. Even many experts in the field don't know all the details of all the optimization algorithms, so the pragmatic approach is to choose a versatile algorithm and use it for most problems. Adam is an excellent choice as your go-to optimizer: it adjusts the learning rate as it does gradient descent, to ensure reasonable values throughout the weight optimization process. The second thing you specify is the loss function. Mean squared error is the most common choice for regression problems. When you use Keras for classification, you will learn a different default loss function.
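As a concrete illustration of the first argument, here is a minimal sketch of creating an Adam optimizer explicitly, assuming TensorFlow's bundled Keras; the learning rate shown is simply Adam's default, written out.

```python
from tensorflow.keras.optimizers import Adam

# Adam's defaults are a sensible starting point; learning_rate is the main
# knob you might tune later. 0.001 is just the default value, spelled out.
my_optimizer = Adam(learning_rate=0.001)
```

You can pass either this optimizer object or simply the string 'adam' as the optimizer argument when you compile.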

3. Compiling a model

Here is an example of the code to compile a model. It builds a model, as you've already seen, and then adds a compile command after building the model.
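The slide's exact code isn't reproduced here, so the following is a minimal sketch of that pattern, assuming a small regression network and a hypothetical feature array called predictors; the layer sizes are purely illustrative.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# predictors is a hypothetical feature array: one row per sample.
predictors = np.random.rand(100, 10)
n_cols = predictors.shape[1]

# Build the model, as you've already seen.
model = Sequential()
model.add(Dense(50, activation='relu', input_shape=(n_cols,)))
model.add(Dense(32, activation='relu'))
model.add(Dense(1))

# Compile: the two key arguments are the optimizer and the loss function.
model.compile(optimizer='adam', loss='mean_squared_error')
```

Passing the string 'adam' uses Adam with its default settings; you could also pass the optimizer object from the earlier sketch.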

4. What is fitting a model

After compiling the model, you can fit it. That is, applying back-propagation and gradient descent with your data to update the weights. The fit step looks similar to what you've seen in scikit-learn, though it has more options, which we will explore soon. Even with the Adam optimizer, which is pretty smart, your optimization will work better if you scale the data so that each feature is, on average, about the same size. One common approach is to subtract each feature's mean from that feature and divide by its standard deviation.
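For example, a quick way to standardize a feature array with NumPy (the array name here is hypothetical) looks like this:

```python
import numpy as np

# predictors is a hypothetical feature array: one row per sample,
# one column per feature.
predictors = np.random.rand(100, 10)

# Subtract each feature's mean and divide by its standard deviation, column-wise.
scaled_predictors = (predictors - predictors.mean(axis=0)) / predictors.std(axis=0)
```

scikit-learn's StandardScaler performs the same transformation if you prefer to stay in that ecosystem.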

5. Fitting a model

You can see what the code looks like here. After the compile step, we run fit, with the predictors as the first argument and the target as the second. When you run this, you will see some output showing the optimization's progress as it fits the data.
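The slide's exact code isn't shown here, but a minimal, self-contained version of the fit step, using hypothetical predictors and target arrays, might look like this:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Hypothetical data: 100 samples, 10 features, one numeric target per sample.
predictors = np.random.rand(100, 10)
target = np.random.rand(100)

model = Sequential()
model.add(Dense(50, activation='relu', input_shape=(10,)))
model.add(Dense(1))
model.compile(optimizer='adam', loss='mean_squared_error')

# fit applies back-propagation and gradient descent to update the weights.
# Keras prints a per-epoch log of the training loss while it runs.
model.fit(predictors, target)
```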

6. Let's practice!

We'll go into more detail about this output soon, but for now, just think of it as a log showing model performance on the training data as we update the model weights.
