# Training a neural network

Given the fully connected neural network (called `model`) which you built in the previous exercise and a train loader called `train_loader` containing the `MNIST` dataset (which we created for you), you're going to train the net to predict the classes of digits. You will use the Adam optimizer to optimize the network, and since this is a classification problem, you will use cross-entropy as the loss function.

## Instructions

**100 XP**

- Instantiate the Adam optimizer with learning rate `3e-4`, and instantiate cross-entropy as the loss function.
- Complete a forward pass on the neural network using the input `data`.
- Using backpropagation, compute the gradients of the weights, and then update the weights using the `Adam` optimizer.
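The steps above can be sketched as a standard PyTorch training loop. Note that `model` and `train_loader` are provided in the exercise; the stand-in network architecture and the synthetic data below are assumptions made only so this sketch runs on its own.

```python
import torch
import torch.nn as nn
import torch.optim as optim
from torch.utils.data import DataLoader, TensorDataset

# Stand-in for the `model` from the previous exercise (assumption:
# a fully connected net over flattened 28x28 MNIST images, 10 classes).
model = nn.Sequential(
    nn.Linear(28 * 28, 200),
    nn.ReLU(),
    nn.Linear(200, 10),
)

# Stand-in for `train_loader` (synthetic tensors; the real exercise
# uses the MNIST dataset prepared for you).
images = torch.randn(64, 1, 28, 28)
labels = torch.randint(0, 10, (64,))
train_loader = DataLoader(TensorDataset(images, labels), batch_size=32)

# Instantiate the Adam optimizer with learning rate 3e-4,
# and cross-entropy as the loss function.
optimizer = optim.Adam(model.parameters(), lr=3e-4)
criterion = nn.CrossEntropyLoss()

for data, target in train_loader:
    optimizer.zero_grad()               # clear gradients from the previous step
    data = data.view(data.size(0), -1)  # flatten each image into a vector
    output = model(data)                # forward pass
    loss = criterion(output, target)    # cross-entropy loss
    loss.backward()                     # backpropagation: compute gradients
    optimizer.step()                    # Adam updates the weights
```

Calling `optimizer.zero_grad()` at the start of each iteration matters because PyTorch accumulates gradients across `backward()` calls by default.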