1. Shared layers
In this chapter, you will create a model with two inputs: one for each team in the basketball dataset.
However, you want these two teams to each use the same embedding layer you defined in the previous lesson.
Accomplishing this requires a shared layer.
2. Shared layers
Shared layers are an advanced deep learning concept, and are only possible with the Keras functional API. They allow you to define an operation and then apply the exact same operation (with the exact same weights) on different inputs.
In this model, you will share the team rating layer between both inputs. The learned rating will be the same whether it is applied to team 1 or team 2.
3. Shared layers
To create a shared layer, you must first create two (or more) inputs, each of which will be passed to the shared layer.
In this case, you will use two inputs.
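As a minimal sketch (assuming the tensorflow.keras API; the input names are illustrative), the two inputs might be defined like this:

    from tensorflow.keras.layers import Input

    # One input per team; each holds a single team ID, so the shape is (1,)
    team_in_1 = Input(shape=(1,), name='Team-1-In')
    team_in_2 = Input(shape=(1,), name='Team-2-In')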
4. Shared layers
Once you have two inputs, the magic of the Keras functional API becomes apparent.
Recall from chapter 1 that calling Dense() returns a function as its output. This function, which Dense() returns, takes a tensor as input and produces a tensor as output.
You can use that same returned function to create a shared layer!
Doing so is as simple as calling it twice, with a different input tensor each time.
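Here is a minimal sketch of that pattern, again assuming tensorflow.keras and a hypothetical Dense(1) rating layer:

    from tensorflow.keras.layers import Input, Dense

    team_in_1 = Input(shape=(1,), name='Team-1-In')
    team_in_2 = Input(shape=(1,), name='Team-2-In')

    # Create the layer once...
    shared_layer = Dense(1)

    # ...then call it on each input; both outputs use the exact same weights
    rating_1 = shared_layer(team_in_1)
    rating_2 = shared_layer(team_in_2)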
5. Sharing multiple layers as a model
Recall the category embedding model we made in the previous lesson. This model first embeds an input and then flattens it.
You can also share models, not just layers. This is really cool and is part of what makes the functional API so useful. You can define modular components of models and then reuse them.
We define an embedding layer and wrap it in a model.
We then define two input tensors, and pass each one to the same model, producing two output tensors.
This will use the same model, with the same layers and the same weights, for mapping each input to its corresponding output.
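A sketch of this idea, assuming tensorflow.keras and a hypothetical team count n_teams (use the number of unique teams in your own data):

    from tensorflow.keras.layers import Input, Embedding, Flatten
    from tensorflow.keras.models import Model

    n_teams = 10000  # hypothetical value for illustration only

    # Team-strength model: embed a team ID, then flatten to a single rating
    teamid_in = Input(shape=(1,))
    strength = Embedding(input_dim=n_teams, output_dim=1)(teamid_in)
    strength = Flatten()(strength)
    team_strength_model = Model(teamid_in, strength)

    # Two new inputs, one per team, both passed through the same model
    team_in_1 = Input(shape=(1,), name='Team-1-In')
    team_in_2 = Input(shape=(1,), name='Team-2-In')
    team_1_strength = team_strength_model(team_in_1)
    team_2_strength = team_strength_model(team_in_2)

Because both calls go through team_strength_model, the embedding weights are shared: a given team always maps to the same learned strength.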
6. Sharing multiple layers as a model
In other words, you can take an arbitrary sequence of Keras layers and wrap them up in a model.
Once you have a model, you can re-use that model to share that sequence of steps for different input layers.
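As a generic illustration (assuming tensorflow.keras; the layer sizes here are arbitrary):

    from tensorflow.keras.layers import Input, Dense
    from tensorflow.keras.models import Model

    # Wrap an arbitrary stack of layers in a model
    block_in = Input(shape=(8,))
    hidden = Dense(4, activation='relu')(block_in)
    block_out = Dense(1)(hidden)
    shared_block = Model(block_in, block_out)

    # Reuse the whole block on two different inputs; all weights are shared
    in_a = Input(shape=(8,))
    in_b = Input(shape=(8,))
    out_a = shared_block(in_a)
    out_b = shared_block(in_b)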
7. Let's practice!
Now you will create a shared layer using the team strength embedding model you made in the previous lesson.