Exercise

ReLU activation

In this exercise, we have the same setup as in the previous exercise. In addition, we have instantiated the ReLU activation function, called relu().

Now we are going to build a neural network which has non-linearity and, by doing so, convince ourselves that networks with multiple layers and non-linear activation functions between them cannot be expressed as a neural network with one layer.
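A minimal sketch of this idea, with illustrative tensor names and shapes (not the pre-loaded exercise variables): two purely linear layers collapse into one composed weight matrix, but inserting ReLU between them breaks that equivalence.

```python
import torch

torch.manual_seed(0)
x = torch.randn(1, 4)
w1 = torch.randn(4, 4)
w2 = torch.randn(4, 4)
relu = torch.nn.ReLU()

# Two purely linear layers give the same output as one layer
# whose weight is the matrix product of the two weights.
two_linear_layers = torch.matmul(torch.matmul(x, w1), w2)
one_composed_layer = torch.matmul(x, torch.matmul(w1, w2))
print(torch.allclose(two_linear_layers, one_composed_layer))  # True

# With ReLU applied between the layers, the collapse no longer holds.
with_relu = torch.matmul(relu(torch.matmul(x, w1)), w2)
print(torch.allclose(with_relu, one_composed_layer))  # False, in general
```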

Instructions
100 XP
  • Apply non-linearity to hidden_1 and hidden_2.
  • Apply non-linearity to the product of the first two weights.
  • Multiply the result of the previous step with weight_3.
  • Multiply input_layer with weight and print the results (a sketch of these steps follows below).
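
One possible sketch of the solution, assuming input_layer, weight_1, weight_2, weight_3, and relu() are pre-loaded as in the previous exercise; the shapes below are only illustrative stand-ins for the actual tensors.

```python
import torch

# Illustrative stand-ins for the pre-loaded tensors (shapes assumed).
torch.manual_seed(42)
input_layer = torch.randn(1, 4)
weight_1 = torch.randn(4, 4)
weight_2 = torch.randn(4, 4)
weight_3 = torch.randn(4, 4)
relu = torch.nn.ReLU()

# Apply non-linearity to hidden_1 and hidden_2.
hidden_1_activated = relu(torch.matmul(input_layer, weight_1))
hidden_2_activated = relu(torch.matmul(hidden_1_activated, weight_2))
print(torch.matmul(hidden_2_activated, weight_3))

# Apply non-linearity to the product of the first two weights.
weight_composed_1_activated = relu(torch.matmul(weight_1, weight_2))

# Multiply the result of the previous step with weight_3.
weight = torch.matmul(weight_composed_1_activated, weight_3)

# Multiply input_layer with weight and print the results.
print(torch.matmul(input_layer, weight))
```

The two printed tensors differ: once ReLU is applied between the layers, the network can no longer be rewritten as a single multiplication by one composed weight matrix.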