
Embedding in PyTorch

PyBooks found success with a book recommendation system. However, the system doesn't account for some of the semantics found in the text. PyTorch's built-in embedding layer can learn and represent relationships between words directly from data. Your team is curious to explore this capability to improve the book recommendation system. Can you help implement it?

torch and torch.nn as nn have been imported for you.
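If you run the code outside the exercise environment, the preloaded modules correspond to these imports:

import torch
import torch.nn as nn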

This exercise is part of the course

Deep Learning for Text with PyTorch


Exercise instructions

  • Map a unique index to each word in words, saving to word_to_idx.
  • Look up each word's index in word_to_idx, convert the resulting list into a PyTorch tensor, and save it to inputs.
  • Initialize an embedding layer using torch's nn module with ten dimensions.
  • Pass the inputs tensor to the embedding layer and review the output.

Interactive exercise

Try this exercise by completing the sample code below, then compare your answer with the completed version that follows the scaffold.

# Map a unique index to each word
words = ["This", "book", "was", "fantastic", "I", "really", "love", "science", "fiction", "but", "the", "protagonist", "was", "rude", "sometimes"]
word_to_idx = {word: ____ for i, word in enumerate(____)}

# Convert word_to_idx to a tensor
inputs = ____.____([word_to_idx[w] for w in words])

# Initialize embedding layer with ten dimensions
embedding = nn.____(num_embeddings=len(words), embedding_dim=____)

# Pass the tensor to the embedding layer
output = embedding(____)
print(output)
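For reference, here is one completed version of the scaffold, following the instructions above:

# Map a unique index to each word
words = ["This", "book", "was", "fantastic", "I", "really", "love", "science", "fiction", "but", "the", "protagonist", "was", "rude", "sometimes"]
word_to_idx = {word: i for i, word in enumerate(words)}

# Convert the word indices to a tensor
inputs = torch.tensor([word_to_idx[w] for w in words])

# Initialize embedding layer with ten dimensions
embedding = nn.Embedding(num_embeddings=len(words), embedding_dim=10)

# Pass the tensor to the embedding layer; each index becomes a 10-dimensional vector
output = embedding(inputs)
print(output)

Note that "was" appears twice in words, so the dictionary comprehension keeps only its later index; num_embeddings=len(words) is still large enough because every index in inputs comes from enumerate(words). The printed output is a 15 x 10 tensor: one ten-dimensional embedding vector per word position.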