Exercise

ReLU activation

In this exercise, we have the same settings as the previous exercise, but now we are going to build a neural network that includes non-linearity. By doing so, we will convince ourselves that networks with multiple layers and non-linear activation functions cannot be expressed as a neural network with a single layer.

We have already instantiated the ReLU activation function called relu() for you.
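If you want to recreate this setup on your own machine, the provided relu() is presumably an instance of PyTorch's nn.ReLU module; a minimal sketch of that assumed setup:

```python
import torch.nn as nn

# Assumed setup: the exercise's relu() is most likely an nn.ReLU instance
relu = nn.ReLU()
```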

Instructions

100 XP
  • Apply the non-linearity to the two hidden layers and print the result.
  • Apply the non-linearity to the product of the first two weights.
  • Multiply the result of the previous step with weight_3.
  • Multiply input_layer with weight and print the result; see the sketch after this list.
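A hedged solution sketch, assuming the tensors carried over from the previous exercise are named input_layer, weight_1, weight_2, and weight_3 (the shapes below are illustrative, not necessarily the ones used in the exercise):

```python
import torch
import torch.nn as nn

# Assumed setup from the previous exercise: a 1 x 4 input and three 4 x 4 weights
input_layer = torch.rand(1, 4)
weight_1 = torch.rand(4, 4)
weight_2 = torch.rand(4, 4)
weight_3 = torch.rand(4, 4)

# The relu() provided by the exercise (assumed to be an nn.ReLU instance)
relu = nn.ReLU()

# Apply the non-linearity to the two hidden layers and print the result
hidden_1_activated = relu(torch.matmul(input_layer, weight_1))
hidden_2_activated = relu(torch.matmul(hidden_1_activated, weight_2))
print(torch.matmul(hidden_2_activated, weight_3))

# Apply the non-linearity to the product of the first two weights
weight_composed_1_activated = relu(torch.matmul(weight_1, weight_2))

# Multiply the result of the previous step with weight_3
weight = torch.matmul(weight_composed_1_activated, weight_3)

# Multiply input_layer with weight and print the result
print(torch.matmul(input_layer, weight))
```

The two printed tensors generally differ: applying ReLU between the layers is not the same as applying it to the composed weight matrices, which is exactly why a multi-layer network with non-linearity cannot be collapsed into a single layer.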