1. Neural Networks
Now we'll cover neural networks and how to use them in Python.
2. Neural networks
The use of neural networks has grown rapidly, driven by increases in GPU computational power, improvements in software, and their growing popularity. This plot shows the exponential growth of GPU computational power over time.
3. Neural networks have potential
Neural nets are similar to other models we've used; we supply features and targets to get predictions. However, neural nets have the potential to outperform other models because they capture non-linearity and variable interactions, and are highly customizable. Neural nets were inspired by the functionality of the human brain.
4. Neural network diagram
Here's a simple neural network; each row is a layer of neurons. Each neuron connects to all neurons in the next layer, so these are called dense layers.
5. Neural network diagram
The neurons, or circles, represent mathematical operations. We send input data into the first layer of neurons, multiply it by weights, and add a bias.
6. Neural network math
We represent these operations with matrix math, or linear algebra. Because GPUs are optimized for matrix operations, they can be used to speed up these calculations.
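As a quick sketch of that math (the shapes here are assumptions for illustration, not from the slides), a dense layer in NumPy looks like this:

```python
import numpy as np

# toy example: 3 samples, 4 input features, a layer of 2 neurons
inputs = np.random.rand(3, 4)    # input data (assumed shape)
weights = np.random.rand(4, 2)   # one weight per input-to-neuron connection
biases = np.random.rand(2)       # one bias per neuron

# each neuron takes a weighted sum of its inputs and adds its bias
outputs = np.dot(inputs, weights) + biases   # shape (3, 2)
```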
7. Neural network activations
Each layer has an activation. After the data passes through a layer's weights and biases, we apply the activation function to add non-linearity.
8. ReLU
We'll use a common activation, which is ReLU -- rectified linear units. This is 0 for negative numbers, and linear for positive numbers.
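As a minimal sketch, ReLU is a one-liner in NumPy:

```python
import numpy as np

def relu(x):
    # 0 for negative inputs, linear (identity) for positive inputs
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # [0.  0.  0.  1.5 3. ]
```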
9. Neural network loss function
Once we have predictions, we use loss functions to compare our predictions and targets. For regression we often use mean squared error.
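Concretely, mean squared error is the average of the squared differences between targets and predictions; the values below are made up for illustration:

```python
import numpy as np

targets = np.array([3.0, -0.5, 2.0])      # assumed example values
predictions = np.array([2.5, 0.0, 2.0])

mse = np.mean((targets - predictions) ** 2)
print(mse)  # ~0.167
```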
10. How neural nets learn
Predictions are made by passing data through in the forward direction. At the end of the neural network, we have a single node which yields our predictions.
11. How neural nets learn
Next, we take the error from our loss function and pass it backwards through the network, updating the weights and biases so our predictions get closer to the truth. This process is called backpropagation, and it involves taking derivatives of the forward-direction math equations.
Backpropagation helps neural nets learn more effectively, and is why standardizing our data is important: features on very different scales can make the weight updates unstable.
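To make that concrete, here's a toy single-neuron sketch of one backpropagation step in plain Python; the values and learning rate are assumptions for illustration, not what keras does internally:

```python
# one neuron, one feature: forward pass, then one gradient update
x, y = 2.0, 5.0    # a single input and its target (assumed values)
w, b = 0.5, 0.0    # initial weight and bias
lr = 0.1           # learning rate (assumed)

pred = w * x + b       # forward pass: prediction is 1.0
error = pred - y       # derivative of 0.5 * (pred - y)**2 w.r.t. pred
w -= lr * error * x    # chain rule: dL/dw = error * x
b -= lr * error        # dL/db = error

print(w * x + b)       # 3.0 -- the new prediction is closer to the target
```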
12. Implementing a neural net with keras
We'll use the keras library with the TensorFlow backend to implement neural networks. keras is a high-level API that allows us to design neural nets with minimal code while still allowing a lot of customization.
13. Implementing a neural net with keras
In keras, we can use the sequential or functional API. We'll stick with sequential for now because it's simpler. We first import the sequential class from keras-dot-models and the dense layer we'll use.
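As a minimal sketch, the imports look like this (assuming the standalone keras package; in newer TensorFlow versions the same classes live under tensorflow-dot-keras):

```python
from keras.models import Sequential
from keras.layers import Dense
```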
14. Implementing a neural net with keras
We create the model with the Sequential() class. We then add layers by using the dot-add() function. For the first layer we use 50 nodes, and specify input_dim as the number of features from our features shape. We add another layer with 10 nodes, and use ReLU for the activations. Our last layer is one node, and is linear for regression.
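Putting that together, a sketch of the model might look like the following, continuing from the imports above and assuming `train_features` is a 2D NumPy array of our features, with ReLU in the hidden layers as described:

```python
model = Sequential()

# first layer: 50 nodes, with input_dim set from our features' shape
model.add(Dense(50, input_dim=train_features.shape[1], activation='relu'))

# another layer with 10 nodes, also using ReLU
model.add(Dense(10, activation='relu'))

# last layer: one linear node for regression
model.add(Dense(1, activation='linear'))
```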
15. Fitting the model
To fit the model, we compile it with an optimizer and loss function. The optimizer affects how fast the net learns. We'll use adam because it often works well, although there are many others like RMSprop. The loss function mse is mean squared error. Next, we fit the model with features and targets and specify the number of epochs, which is the number of training cycles.
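Here's a sketch of that step, continuing with the `model` from above and assuming `train_features` and `train_targets` arrays (the epoch count is an arbitrary choice for illustration):

```python
# compile with the adam optimizer and mean squared error loss
model.compile(optimizer='adam', loss='mse')

# fit for 50 epochs, keeping the History object for later
history = model.fit(train_features, train_targets, epochs=50)
```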
16. Examining the loss
We want to look at the loss versus epochs after training to ensure the loss has flattened out. Normally we also want to split off some of the training data into a validation set to make sure we're not overfitting, but we won't cover that here.
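One way to do that, assuming we kept the History object returned by dot-fit() above, is to plot the recorded loss by epoch with matplotlib:

```python
import matplotlib.pyplot as plt

# the history object records the loss at each epoch
plt.plot(history.history['loss'])
plt.xlabel('epoch')
plt.ylabel('loss')
plt.show()
```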
17. Checking out performance
Finally, we'll check out performance with R-squared and plot the predictions versus actual values. keras models don't have a dot-score() function, so we'll use sklearn's r2_score() to calculate R-squared.
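Here's a sketch of that check, assuming `test_features` and `test_targets` arrays:

```python
from sklearn.metrics import r2_score

# keras models have no dot-score(), so we predict and score separately
predictions = model.predict(test_features)
print(r2_score(test_targets, predictions))
```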
18. Plot performance
Plotting predictions versus actual values is the same as we've done before with matplotlib.
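For example, a scatter plot reusing `predictions` and `test_targets` from the previous step:

```python
import matplotlib.pyplot as plt

plt.scatter(predictions, test_targets)
plt.xlabel('predictions')
plt.ylabel('actual')
plt.show()
```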
19. Make a neural net!
Ok, you're ready -- go make a neural net!