
Embedding in PyTorch

PyBooks has found success with its book recommendation system. However, the system doesn't capture the semantics of the text it works with. PyTorch's built-in embedding layer can learn and represent relationships between words directly from data. Your team is curious to explore this capability to improve the book recommendation system. Can you help implement it?
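
As a rough sketch of the idea (the vocabulary size and dimensions below are arbitrary and not part of the exercise), nn.Embedding acts as a trainable lookup table that maps integer word indices to dense vectors:

import torch
import torch.nn as nn

# A toy embedding layer: 5 possible indices, each mapped to a 3-dimensional vector
embedding = nn.Embedding(num_embeddings=5, embedding_dim=3)

# Looking up indices 0 and 2 returns their learnable 3-dimensional vectors
vectors = embedding(torch.tensor([0, 2]))
print(vectors.shape)  # torch.Size([2, 3])

The vectors start out random and are updated during training, which is how the layer learns relationships between words from data.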

torch and torch.nn as nn have been imported for you.

This exercise is part of the course Deep Learning for Text with PyTorch.

Exercise instructions

  • Map a unique index to each word in words, saving to word_to_idx.
  • Use word_to_idx to look up the index of each word in words, convert the result to a PyTorch tensor, and save it to inputs.
  • Initialize an embedding layer using the nn module with ten embedding dimensions.
  • Pass the inputs tensor to the embedding layer and review the output.

Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

# Map a unique index to each word
words = ["This", "book", "was", "fantastic", "I", "really", "love", "science", "fiction", "but", "the", "protagonist", "was", "rude", "sometimes"]
word_to_idx = {word: ____ for i, word in enumerate(____)}

# Convert word_to_idx to a tensor
inputs = ____.____([word_to_idx[w] for w in words])

# Initialize embedding layer with ten dimensions
embedding = nn.____(num_embeddings=len(words), embedding_dim=____)

# Pass the tensor to the embedding layer
output = embedding(____)
print(output)
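
For reference, here is one way the blanks could be filled in, consistent with the instructions above (a sketch of a possible solution, not necessarily the only accepted answer):

import torch
import torch.nn as nn

# Map a unique index to each word
words = ["This", "book", "was", "fantastic", "I", "really", "love", "science", "fiction", "but", "the", "protagonist", "was", "rude", "sometimes"]
word_to_idx = {word: i for i, word in enumerate(words)}

# Convert the word indices to a tensor
inputs = torch.tensor([word_to_idx[w] for w in words])

# Initialize embedding layer with ten dimensions
embedding = nn.Embedding(num_embeddings=len(words), embedding_dim=10)

# Pass the tensor to the embedding layer
output = embedding(inputs)
print(output)  # shape (15, 10): one ten-dimensional vector per word in the sentence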