
Embedding in PyTorch

PyBooks found success with a book recommendation system. However, it doesn't account for some of the semantics found in the text. PyTorch's built-in embedding layer can learn and represent the relationship between words directly from data. Your team is curious to explore this capability to improve the book recommendation system. Can you help implement it?

torch and torch.nn as nn have been imported for you.
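As a quick illustration of what this layer does, here is a minimal sketch (the toy vocabulary size and dimension are chosen only for this example and are not part of the exercise): nn.Embedding acts as a trainable lookup table that maps integer word indices to dense vectors.

# Minimal sketch: nn.Embedding as a trainable lookup table
import torch
import torch.nn as nn  # already imported in the exercise environment

toy_embedding = nn.Embedding(num_embeddings=3, embedding_dim=4)  # 3 words, 4-dim vectors
toy_indices = torch.tensor([0, 2, 1])                            # indices of three words
print(toy_embedding(toy_indices).shape)                          # torch.Size([3, 4])

Each row of the result is a learnable vector, so during training the model can adjust these vectors to capture relationships between words.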

This exercise is part of the course

Deep Learning for Text with PyTorch


Exercise instructions

  • Map a unique index to each word in words, saving to word_to_idx.
  • Convert the word indices from word_to_idx into a PyTorch tensor and save it to inputs.
  • Initialize an embedding layer using torch's nn module with ten dimensions.
  • Pass the inputs tensor to the embedding layer and review the output.

Interactive hands-on exercise

Try to solve this exercise by completing the sample code.

# Map a unique index to each word
words = ["This", "book", "was", "fantastic", "I", "really", "love", "science", "fiction", "but", "the", "protagonist", "was", "rude", "sometimes"]
word_to_idx = {word: ____ for i, word in enumerate(____)}

# Convert word_to_idx to a tensor
inputs = ____.____([word_to_idx[w] for w in words])

# Initialize embedding layer with ten dimensions
embedding = nn.____(num_embeddings=len(words), embedding_dim=____)

# Pass the tensor to the embedding layer
output = embedding(____)
print(output)
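
For reference, here is one way the blanks above could be filled in; this is a sketch of a possible solution, assuming the torch and nn imports mentioned earlier, not necessarily the course's official answer.

# Map a unique index to each word
words = ["This", "book", "was", "fantastic", "I", "really", "love", "science", "fiction", "but", "the", "protagonist", "was", "rude", "sometimes"]
word_to_idx = {word: i for i, word in enumerate(words)}

# Look up each word's index and convert the list to a tensor
inputs = torch.tensor([word_to_idx[w] for w in words])

# Initialize embedding layer with ten dimensions
embedding = nn.Embedding(num_embeddings=len(words), embedding_dim=10)

# Pass the tensor to the embedding layer
output = embedding(inputs)
print(output)  # tensor of shape (15, 10): one ten-dimensional vector per word

The printed output has one ten-dimensional vector per entry in words. Note that "was" appears twice, so word_to_idx ends up with 14 unique keys; every index still stays below num_embeddings, so the lookup works as expected.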