Exercise

Build and compile RNN network

So far, you have completed all the data preprocessing steps and have the input and target vectors ready. It is time to build the recurrent neural network. You'll create a small architecture with 50 SimpleRNN units in the first layer, followed by a dense layer. The dense layer will generate a probability distribution over the vocabulary for the next character, so its size will be the same as the size of the vocabulary.

The dataset is available as the DataFrame names. The length of the longest name is saved in the variable max_len, and the vocabulary is available in the variable vocabulary. The SimpleRNN, Dense, Activation, and TimeDistributed layers are already imported from keras.layers, and the Sequential model is already imported from keras.models.

Instructions
100 XP
  • Add a SimpleRNN layer of size 50 with return_sequences set to True.
  • Add a time distributed Dense layer of size equal to the vocabulary.
  • Add a softmax activation layer.
  • Compile the model using "categorical_crossentropy" loss and "adam" optimizer.
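The steps above can be sketched as follows. This is a minimal, self-contained example, not the exercise's checked solution: max_len and vocabulary are stand-in values here (in the exercise they are already defined from the names DataFrame), and the imports are spelled out explicitly.

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (Input, SimpleRNN, Dense,
                                     Activation, TimeDistributed)

# Stand-in values; in the exercise these come from the preprocessing steps.
max_len = 10                                      # length of the longest name
vocabulary = list("abcdefghijklmnopqrstuvwxyz ")  # 27 characters

model = Sequential()
# One one-hot-encoded character per timestep: (timesteps, vocab size).
model.add(Input(shape=(max_len, len(vocabulary))))
# 50 SimpleRNN units; return_sequences=True emits an output at every timestep.
model.add(SimpleRNN(50, return_sequences=True))
# TimeDistributed applies the same Dense layer to each timestep independently,
# producing one score per vocabulary character at every position.
model.add(TimeDistributed(Dense(len(vocabulary))))
# Softmax turns each timestep's scores into a probability distribution.
model.add(Activation("softmax"))

model.compile(loss="categorical_crossentropy", optimizer="adam")
model.summary()
```

Because return_sequences is True, the model outputs shape (batch, max_len, vocabulary size) rather than only the last timestep, which is what the per-character targets require.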