Exercise

Neural networks

Let us examine the difference between neural networks that apply the ReLU non-linearity and those that do not. The input, called input_layer, and three sets of weights, called weight_1, weight_2, and weight_3, have already been initialized.

We are going to convince ourselves that a network with multiple layers but no non-linearity can be expressed as a neural network with a single layer.
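To see why, note that each layer of such a network is just a matrix multiplication, and matrix multiplication is associative. Writing x for input_layer and W_i for weight_i, the chain of weights collapses into one matrix:

y = ((x W_1) W_2) W_3 = x (W_1 W_2 W_3) = x W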

The network, along with the shapes of the layers and weights, is shown in the figure accompanying this exercise.

Instructions
  • Calculate the first and second hidden layers by multiplying the appropriate inputs with their corresponding weights.
  • Calculate and print the output of the multi-layer network.
  • Set weight_composed_1 to the product of weight_1 with weight_2, then set weight to the product of weight_composed_1 with weight_3.
  • Calculate and print the output of the single-layer network, and compare it with the first result (a sketch of these steps follows the list).
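The following is a minimal sketch of the steps above. The tensor shapes here are assumptions for illustration; in the actual exercise, input_layer and the three weight tensors are provided for you.

```python
import torch

# Hypothetical shapes; the exercise initializes these tensors for you.
torch.manual_seed(0)
input_layer = torch.rand(1, 4)
weight_1 = torch.rand(4, 4)
weight_2 = torch.rand(4, 4)
weight_3 = torch.rand(4, 4)

# Multi-layer network: propagate the input through each linear layer.
hidden_1 = torch.matmul(input_layer, weight_1)
hidden_2 = torch.matmul(hidden_1, weight_2)
print(torch.matmul(hidden_2, weight_3))

# Single-layer network: compose the weights into one matrix first.
weight_composed_1 = torch.matmul(weight_1, weight_2)
weight = torch.matmul(weight_composed_1, weight_3)
print(torch.matmul(input_layer, weight))
```

Both prints produce the same tensor (up to floating-point rounding), confirming that without a non-linearity between layers, the three-layer network is equivalent to a single layer.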