Exercise

Building the encoder

In the last lesson, you laid the groundwork for building a text auto-completion system using the Enron email dataset. So far, you have converted the email dataset into a set of prefixes and suffixes and created the corresponding input and target vectors. Now that the dataset is pre-processed, it is time to build the encoder-decoder network. The input to the network is the set of prefixes, and the goal is to generate the corresponding suffixes. The encoder will take the prefixes as input and summarize them into its internal state vectors, which the decoder will use as its initial state.

You'll first build the encoder using Keras. The Input, LSTM and Dense layers are already imported from keras.layers. The vocabulary is saved in vocabulary.

Instructions
100 XP
  • Create the Input layer.
  • Create an LSTM layer with 256 units.
  • Pass the input to the LSTM layer and capture its output and states.
  • Save the encoder states.
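
For reference, here is a minimal sketch of these four steps. It assumes the prefixes are fed as one-hot vectors with one dimension per token in vocabulary; names such as encoder_inputs, encoder_lstm, and encoder_states are illustrative rather than prescribed by the exercise.

```python
from keras.layers import Input, LSTM, Dense

# Input layer: variable-length sequences of one-hot vectors,
# one dimension per token in the vocabulary (assumed shape).
encoder_inputs = Input(shape=(None, len(vocabulary)))

# LSTM layer with 256 units; return_state=True makes the layer
# return its final hidden and cell states along with the output.
encoder_lstm = LSTM(256, return_state=True)

# Pass the input through the LSTM and unpack output and states.
encoder_outputs, state_h, state_c = encoder_lstm(encoder_inputs)

# Save the internal states; these will seed the decoder later.
encoder_states = [state_h, state_c]
```

Setting return_state=True is the key choice here: it is the pair of final state tensors, not the per-timestep output, that carries the encoder's summary of the prefix and will initialize the decoder in the next exercise.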