Understanding the RepeatVector layer
You will now explore how the RepeatVector layer works. The RepeatVector layer adds an extra dimension to your data. For example, if you have an input of shape (batch size, input size) and you want to feed it to a GRU layer, you can use a RepeatVector layer to convert the input to a tensor of shape (batch size, sequence length, input size).
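To make that shape change concrete, here is a minimal sketch. The repeat count of 3 and the toy input [[10, 20]] are illustrative choices, not values from the exercise itself.

from tensorflow.keras.layers import Input, RepeatVector
from tensorflow.keras.models import Model
import numpy as np

# An input of shape (batch size, 2)
inp = Input(shape=(2,))
# RepeatVector(3) repeats each input vector 3 times along a new time axis,
# turning (batch size, 2) into (batch size, 3, 2)
rep = RepeatVector(3)(inp)
model = Model(inputs=inp, outputs=rep)

x = np.array([[10, 20]])   # shape (1, 2)
y = model.predict(x)       # shape (1, 3, 2): the row [10, 20] repeated 3 times
print(y.shape)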
In this exercise, you will define a model that repeats a given input a fixed number of times. You will then feed a numpy array to the model and inspect how the model transforms the input's shape.
Exercise instructions
- Define a RepeatVector layer that repeats the input 6 times.
- Define a Model that takes the input layer in and produces the repeat vector output.
- Define a numpy array object that has data [[0, 1], [2, 3]].
- Predict the output of the model by feeding x as an input.
Hands-on interactive exercise
Have a go at this exercise by completing this sample code.
from tensorflow.keras.layers import Input, RepeatVector
from tensorflow.keras.models import Model
import numpy as np
inp = Input(shape=(2,))
# Define a RepeatVector that repeats the input 6 times
rep = ____(____)(inp)
# Define a model
model = ____(____=____, ____=____)
# Define input x
x = ____.____([____,____])
# Get model prediction y
y = ____.____(____)
print('x.shape = ',x.shape,'\ny.shape = ',y.shape)
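For reference, here is one way the blanks could be filled in. This is a sketch based on the instructions above, not necessarily the course's official solution, and it keeps the same variable names as the template.

from tensorflow.keras.layers import Input, RepeatVector
from tensorflow.keras.models import Model
import numpy as np

inp = Input(shape=(2,))
# Repeat the (batch size, 2) input 6 times -> (batch size, 6, 2)
rep = RepeatVector(6)(inp)
# Model that maps the input layer to the repeated output
model = Model(inputs=inp, outputs=rep)
# Input array with data [[0, 1], [2, 3]] -> shape (2, 2)
x = np.array([[0, 1], [2, 3]])
# Prediction: y has shape (2, 6, 2)
y = model.predict(x)
print('x.shape = ', x.shape, '\ny.shape = ', y.shape)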
This exercise is part of the course
Machine Translation with Keras
Are you curious about the inner workings of the models that are behind products like Google Translate?
In this chapter, you will implement the encoder-decoder model with the Keras functional API. While doing so, you will learn several useful Keras layers such as RepeatVector and TimeDistributed layers.