Exercise

Defining the Teacher Forcing model layers

You will define a new-and-improved version of the machine translation model that you defined earlier. Did you know that large-scale systems such as Google's Neural Machine Translation model were trained with this Teacher Forcing technique?

As you have already seen, your previous model needs to change slightly to adopt Teacher Forcing. In this exercise, you will make the necessary changes to the previous model. You have been provided with the language parameters en_len and fr_len (the lengths of a padded English/French sentence), en_vocab and fr_vocab (the vocabulary sizes of the English/French datasets), and hsize (the hidden layer size of the GRU layers). Remember that the decoder will accept a French sequence with one item fewer than fr_len. Also remember that we use the prefix en to refer to encoder-related things and de for decoder-related things.

Instructions

  • Import the layers submodule from tensorflow.keras.
  • Get the encoder output and state values and assign them to en_out and en_state respectively.
  • Define a decoder Input layer which accepts a fr_len-1 long sequence of one-hot-encoded French words.
  • Define a TimeDistributed Dense softmax layer with fr_vocab nodes.
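The steps above can be sketched as follows. This is a minimal illustration, not the exercise solution: the values chosen for en_len, fr_len, en_vocab, fr_vocab, and hsize are placeholders, since the real parameters are provided by the exercise environment.

```python
import tensorflow as tf
from tensorflow.keras import layers

# Placeholder values for the language parameters (assumed, for illustration only).
en_len, fr_len = 15, 12          # padded English/French sentence lengths
en_vocab, fr_vocab = 150, 200    # English/French vocabulary sizes
hsize = 48                       # hidden size of the GRU layers

# Encoder: consumes the full one-hot-encoded English sequence and
# returns both its output and its final state.
en_inputs = layers.Input(shape=(en_len, en_vocab))
en_gru = layers.GRU(hsize, return_state=True)
en_out, en_state = en_gru(en_inputs)

# Decoder: with Teacher Forcing, it accepts a fr_len-1 long one-hot-encoded
# French sequence and is initialized with the encoder's final state.
de_inputs = layers.Input(shape=(fr_len - 1, fr_vocab))
de_gru = layers.GRU(hsize, return_sequences=True)
de_out = de_gru(de_inputs, initial_state=en_state)

# TimeDistributed Dense softmax layer with fr_vocab nodes: predicts a
# distribution over the French vocabulary at every decoder time step.
de_dense = layers.TimeDistributed(layers.Dense(fr_vocab, activation="softmax"))
de_pred = de_dense(de_out)

model = tf.keras.models.Model(inputs=[en_inputs, de_inputs], outputs=de_pred)
```

During training, the decoder input would be the target French sentence shifted by one word, so the model learns to predict the next word given the correct previous one.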