
Exercise

Creating a transformer model

At PyBooks, the recommendation engine you're working on needs more refined capabilities to understand the sentiments of user reviews. You believe that using transformers, a state-of-the-art architecture, can help achieve this. You decide to build a transformer model that can encode the sentiments in the reviews to kickstart the project.

The following packages have been imported for you: torch, nn, optim.

The input data contains sentences such as "I love this product", "This is terrible", "Could be better" … and their respective binary sentiment labels such as 1, 0, 0, ...

The input data is split and converted to embeddings in the following variables: train_sentences, train_labels, test_sentences, test_labels, and token_embeddings.
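
As a rough illustration of what these variables might hold (the sequence length and embedding size below are assumptions, not values from the exercise):

```python
import torch

# Illustrative only; the real variables are prepared for you.
train_sentences = ["I love this product", "This is terrible", "Could be better"]
train_labels = torch.tensor([1, 0, 0])

# token_embeddings: one vector per (padded) token in each sentence,
# shape (num_sentences, seq_len, embed_size); seq_len=4 and embed_size=512 are assumed here.
token_embeddings = torch.randn(len(train_sentences), 4, 512)
```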

Instructions

  • Initialize the transformer encoder.
  • Define the fully connected layer based on the number of sentiment classes.
  • In the forward method, pass the input through the transformer encoder followed by the linear layer, as in the sketch below.
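
One way the three steps could come together is sketched below. The class name, the hyperparameter defaults, batch_first=True, the mean pooling over tokens, and the assumption of two sentiment classes are illustrative choices, not the exercise's prescribed solution.

```python
import torch
import torch.nn as nn

class TransformerEncoder(nn.Module):
    def __init__(self, embed_size, heads, num_layers, num_classes=2, dropout=0.1):
        super().__init__()
        # 1. Initialize the transformer encoder: a stack of self-attention layers.
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=embed_size, nhead=heads, dropout=dropout, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        # 2. Fully connected layer sized to the number of sentiment classes.
        self.fc = nn.Linear(embed_size, num_classes)

    def forward(self, x):
        # 3. x holds token embeddings of shape (batch_size, seq_len, embed_size).
        x = self.encoder(x)
        # Average over the token dimension to get one vector per review.
        x = x.mean(dim=1)
        return self.fc(x)

# Hypothetical usage (embed_size must be divisible by heads):
# model = TransformerEncoder(embed_size=512, heads=8, num_layers=3)
# logits = model(token_embeddings)   # -> (batch_size, num_classes)
```

In a fuller pipeline the token embeddings would typically be combined with positional encodings before the encoder, and the model would be trained with a loss from nn (for example nn.CrossEntropyLoss) and an optimizer from optim, consistent with the packages imported for you.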