# Multi-layer neural networks

In this exercise, you'll write code to do forward propagation for a neural network with 2 hidden layers. Each hidden layer has two nodes. The input data has been preloaded as `input_data`.

The nodes in the first hidden layer are called `node_0_0` and `node_0_1`. Their weights are pre-loaded as `weights['node_0_0']` and `weights['node_0_1']`, respectively.

The nodes in the second hidden layer are called `node_1_0` and `node_1_1`. Their weights are pre-loaded as `weights['node_1_0']` and `weights['node_1_1']`, respectively.

We then create a model output from the hidden nodes using weights pre-loaded as `weights['output']`.

## Instructions


- Calculate `node_0_0_input` using its weights `weights['node_0_0']` and the given `input_data`. Then apply the `relu()` function to get `node_0_0_output`.
- Do the same as above for `node_0_1_input` to get `node_0_1_output`.
- Calculate `node_1_0_input` using its weights `weights['node_1_0']` and the outputs from the first hidden layer, `hidden_0_outputs`. Then apply the `relu()` function to get `node_1_0_output`.
- Do the same as above for `node_1_1_input` to get `node_1_1_output`.
- Calculate `model_output` using its weights `weights['output']` and the outputs from the second hidden layer, the `hidden_1_outputs` array. Do not apply the `relu()` function to this output.
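
The steps above can be sketched as follows. Since the preloaded `input_data`, `weights`, and `relu()` aren't shown here, the example defines small stand-in values and a minimal `relu()` of its own; the actual preloaded values in the exercise will differ.

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: return x if positive, else 0."""
    return max(0, x)

# Hypothetical stand-ins for the preloaded input and weights.
input_data = np.array([3, 5])
weights = {
    'node_0_0': np.array([2, 4]),
    'node_0_1': np.array([4, -5]),
    'node_1_0': np.array([-1, 2]),
    'node_1_1': np.array([1, 2]),
    'output':   np.array([2, 7]),
}

# First hidden layer: weighted sum of inputs, then relu.
node_0_0_input = (input_data * weights['node_0_0']).sum()
node_0_0_output = relu(node_0_0_input)
node_0_1_input = (input_data * weights['node_0_1']).sum()
node_0_1_output = relu(node_0_1_input)
hidden_0_outputs = np.array([node_0_0_output, node_0_1_output])

# Second hidden layer: same pattern, fed by the first layer's outputs.
node_1_0_input = (hidden_0_outputs * weights['node_1_0']).sum()
node_1_0_output = relu(node_1_0_input)
node_1_1_input = (hidden_0_outputs * weights['node_1_1']).sum()
node_1_1_output = relu(node_1_1_input)
hidden_1_outputs = np.array([node_1_0_output, node_1_1_output])

# Output layer: weighted sum only — no relu() here.
model_output = (hidden_1_outputs * weights['output']).sum()
print(model_output)
```

Note that the output layer deliberately skips `relu()`: in a regression-style network the final node is a plain weighted sum, so the prediction isn't clipped at zero.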