Exercise

ReLU activation again

Neural networks don't need to have the same number of units in each layer. Here, you are going to experiment with the ReLU activation function again, but this time the layers of the network will have different numbers of units. The input layer will still have 4 features, while the first hidden layer will have 6 units and the output layer will have 2 units. The tensor shapes this implies are sketched just below.
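As a rough illustration of the shapes these layer sizes imply (the variable names match those used in the instructions, but the batch size of 1 is an assumption):

```python
import torch

# Assumed shapes: 1 sample with 4 input features,
# a 4 -> 6 weight matrix, and a 6 -> 2 weight matrix.
input_layer = torch.rand(1, 4)
weight_1 = torch.rand(4, 6)
weight_2 = torch.rand(6, 2)
```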

Instructions

100 XP
  • Instantiate the ReLU() activation function as relu (the function is part of the nn module).
  • Initialize weight_1 and weight_2 with random numbers.
  • Multiply input_layer with weight_1, storing the result in hidden_1.
  • Apply the relu activation function to hidden_1, and then multiply its output with weight_2 (see the sketch after this list for one way these steps fit together).
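
One possible way to put these steps together is sketched below. It assumes input_layer is a tensor of shape (1, 4) as described above; in the exercise environment it may already be defined for you.

```python
import torch
import torch.nn as nn

# Instantiate the ReLU activation function (part of the nn module)
relu = nn.ReLU()

# Assumed input: 1 sample with 4 features (may already be provided)
input_layer = torch.rand(1, 4)

# Initialize the weights with random numbers
weight_1 = torch.rand(4, 6)
weight_2 = torch.rand(6, 2)

# Multiply input_layer with weight_1, storing the result in hidden_1
hidden_1 = torch.matmul(input_layer, weight_1)

# Apply ReLU to hidden_1, then multiply its output with weight_2
hidden_1_activated = relu(hidden_1)
output_layer = torch.matmul(hidden_1_activated, weight_2)
print(output_layer)
```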