Exercise

The Rectified Linear Activation Function

As Dan explained to you in the video, an "activation function" is a function applied at each node. It converts the node's input into some output.

The rectified linear activation function (called ReLU) has been shown to lead to very high-performance networks. It takes a single number as input: it returns 0 if the input is negative, and returns the input itself if the input is positive.

Here are some examples:
relu(3) = 3
relu(-3) = 0
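
A minimal sketch of how such a function might look in plain Python, using only the built-in max() (the parameter name my_input is illustrative, not prescribed by the exercise):

def relu(my_input):
    # Return the input if it is positive, otherwise return 0
    return max(my_input, 0)

print(relu(3))   # prints 3
print(relu(-3))  # prints 0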

Instructions

  • Fill in the definition of the relu() function:
    • Use the max() function to calculate the value for the output of relu().
  • Apply the relu() function to node_0_input to calculate node_0_output.
  • Apply the relu() function to node_1_input to calculate node_1_output.
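
A sketch of what the completed exercise might look like, assuming node_0_input and node_1_input are provided by the exercise environment (the placeholder values below are illustrative only, standing in for the weighted sums computed in earlier steps):

def relu(my_input):
    # ReLU: return the input for positive values, 0 otherwise
    return max(my_input, 0)

# Placeholder values standing in for the node inputs computed earlier
node_0_input = 26
node_1_input = -13

# Apply the relu() function to each node's input
node_0_output = relu(node_0_input)
node_1_output = relu(node_1_input)

print(node_0_output)  # 26
print(node_1_output)  # 0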