
Build your LSTM model

You've already prepared your sequences of text. It's time to build your LSTM model!

Remember that your sequences have 4 words each: your model will be trained on the first three words of each sequence to predict the 4th one. You are going to use an Embedding layer that essentially learns to turn words into meaningful vectors. These vectors are then passed to a simple LSTM layer. The output is a Dense layer with as many neurons as there are words in the vocabulary and a softmax activation, so that the model returns the most probable next word out of all possible words.
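
As a quick illustration (with toy numbers, not the exercise's values), here is a minimal sketch of what an Embedding layer does: it maps integer word indices to dense vectors.

# A toy sketch: a 50-word vocabulary, 8 numbers per word vector
import tensorflow as tf
from tensorflow.keras.layers import Embedding

embedding = Embedding(input_dim=50, output_dim=8)
sample = tf.constant([[4, 20, 11]])  # one sequence of 3 word indices
print(embedding(sample).shape)       # (1, 3, 8): each word becomes an 8-number vector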

The size of the vocabulary (the number of unique words) is stored in vocab_size.
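
If you prepared your sequences with Keras's Tokenizer (an assumption here, not shown in this exercise), vocab_size would typically have been computed along these lines:

# A sketch, assuming a Tokenizer was fit on the raw text
from tensorflow.keras.preprocessing.text import Tokenizer

tokenizer = Tokenizer()
tokenizer.fit_on_texts(text_lines)  # text_lines: hypothetical list of strings
# word_index is 1-based, so add 1 to account for index 0 (reserved for padding)
vocab_size = len(tokenizer.word_index) + 1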

This exercise is part of the course Introduction to Deep Learning with Keras.


Exercise instructions

  • Import the Embedding, LSTM, and Dense layers from tensorflow.keras.layers.
  • Add an Embedding() layer with the vocabulary size as its input dimension that turns words into 8-number vectors and receives sequences of length 3.
  • Add a 32-neuron LSTM() layer.
  • Add a hidden Dense() layer of 32 neurons and an output layer of vocab_size neurons with softmax activation.

Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

# Import the Embedding, LSTM and Dense layers
from tensorflow.keras.____ import ____, ____, ____
# Sequential stacks the layers in order
from tensorflow.keras.models import Sequential

model = Sequential()

# Add an Embedding layer with the right parameters
model.add(____(input_dim=____, input_length=____, output_dim=____))

# Add a 32 unit LSTM layer
model.add(____(____))

# Add a hidden Dense layer of 32 units and an output layer of vocab_size with softmax
model.add(Dense(____, activation='relu'))
model.add(Dense(____, activation=____))
model.summary()
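
Once you have filled in the blanks, a completed version could look like the sketch below. It assumes vocab_size is already defined in your session and uses the input_length argument accepted by the Keras version this course is built on.

# One possible completion (a sketch, not the official solution)
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

model = Sequential()
# Turn each of the 3 input words into an 8-number vector
model.add(Embedding(input_dim=vocab_size, input_length=3, output_dim=8))
# A simple 32-neuron LSTM layer
model.add(LSTM(32))
# Hidden layer of 32 neurons, then one output neuron per vocabulary word
model.add(Dense(32, activation='relu'))
model.add(Dense(vocab_size, activation='softmax'))
model.summary()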