
Defining the encoder

Here you'll take your first step towards creating a machine translation model: implementing the encoder. The encoder you will implement is a very simple model compared to the complex models used in real-world applications such as the Google machine translation service. But don't worry: though the model is simple, the concepts are the same as those behind the complex models. We will use the prefix en (e.g. en_gru) to indicate anything encoder-related and the prefix de (e.g. de_gru) for anything decoder-related.

You will see that we choose en_vocab to be smaller (150) than the actual vocabulary size (228) that we found. A smaller vocabulary reduces the memory footprint of the model, and trimming it slightly is fine because doing so removes only the rarest words. For machine translation tasks, rare words usually carry less value than common ones.
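The preprocessing that produces this reduced vocabulary is not part of this exercise, but as a rough, hypothetical illustration, the Keras Tokenizer can cap a vocabulary at the most frequent words through its num_words argument (the tiny corpus below is made up and is not the course's data):

from tensorflow.keras.preprocessing.text import Tokenizer

en_vocab = 150  # keep only the most frequent words

# Hypothetical sentences standing in for the real source-language corpus
corpus = [
    "she likes to read books in the winter",
    "he likes cold and snowy weather",
]

# num_words caps the vocabulary by frequency: when texts are converted to
# sequences, words outside the top num_words most common words are dropped
tokenizer = Tokenizer(num_words=en_vocab)
tokenizer.fit_on_texts(corpus)
sequences = tokenizer.texts_to_sequences(corpus)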

This exercise is part of the course

Machine Translation with Keras


Exercise instructions

  • Define an Input layer for an input which has a vocabulary size en_vocab and a sequence length en_len, using the shape argument.
  • Define a keras.layers.GRU layer that has hsize hidden units and returns its state.
  • Get the outputs from the GRU layer by feeding in en_inputs and assign the GRU state to en_state and the output to en_out.
  • Define a keras.models.Model whose input is en_inputs and whose output is en_state, and print the model summary.

Interactive hands-on exercise

Try to solve this exercise by completing the sample code; one possible completion is sketched after the scaffold below.

import tensorflow.keras as keras

en_len = 15
en_vocab = 150
hsize = 48

# Define an input layer
en_inputs = keras.layers.____(____=____)
# Define a GRU layer which returns the state
en_gru = ____(____, ____=____)
# Get the output and state from the GRU
____, ____ = ____(____)
# Define and print the model summary
encoder = ____(inputs=____, ____=____)
print(encoder.____)
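One possible completion of the scaffold above is sketched here. It follows the exercise instructions and assumes one-hot encoded inputs of shape (en_len, en_vocab) fed through the Keras functional API; it is an illustrative sketch, not necessarily the course's official solution.

import tensorflow.keras as keras

en_len = 15     # length of each encoder input sequence
en_vocab = 150  # reduced encoder vocabulary size
hsize = 48      # number of GRU hidden units

# Define an input layer: a sequence of en_len one-hot vectors of size en_vocab
en_inputs = keras.layers.Input(shape=(en_len, en_vocab))
# Define a GRU layer which returns its final state in addition to its output
en_gru = keras.layers.GRU(hsize, return_state=True)
# With return_state=True, calling the layer yields both the output and the state
en_out, en_state = en_gru(en_inputs)
# Define the encoder model, mapping the input sequence to the final GRU state
encoder = keras.models.Model(inputs=en_inputs, outputs=en_state)
# summary() prints the architecture itself (it returns None, which print() echoes)
print(encoder.summary())

Because return_sequences is left at its default of False, en_out and en_state both hold the GRU's final hidden state; exposing en_state as the model output is what allows a decoder (the de_ layers mentioned above) to be initialized from the encoder.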