
Embedding in PyTorch

PyBooks found success with a book recommendation system. However, it doesn't capture the semantic relationships between the words in the text. PyTorch's built-in embedding layer can learn and represent these relationships directly from data. Your team is curious to explore this capability to improve the book recommendation system. Can you help implement it?

torch and torch.nn as nn have been imported for you.
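For context, nn.Embedding is essentially a learnable lookup table: each integer index selects one dense vector, and those vectors are updated during training like any other model parameter. Here is a minimal sketch of that idea (the vocabulary size of 5 and embedding dimension of 3 are arbitrary choices for illustration):

import torch
import torch.nn as nn

# A learnable lookup table: 5 possible indices, each mapped to a 3-dimensional vector
toy_embedding = nn.Embedding(num_embeddings=5, embedding_dim=3)

# Look up the vectors for indices 0 and 4
toy_vectors = toy_embedding(torch.tensor([0, 4]))
print(toy_vectors.shape)  # torch.Size([2, 3])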

This exercise is part of the course

Deep Learning for Text with PyTorch


Exercise instructions

  • Map a unique index to each word in words, saving to word_to_idx.
  • Convert word_to_idx into a PyTorch tensor and save to inputs.
  • Initialize an embedding layer using the nn module with ten dimensions.
  • Pass the inputs tensor to the embedding layer and review the output.

Hands-on interactive exercise

Try this exercise with the completed sample code below.

# Map a unique index to each word
words = ["This", "book", "was", "fantastic", "I", "really", "love", "science", "fiction", "but", "the", "protagonist", "was", "rude", "sometimes"]
word_to_idx = {word: i for i, word in enumerate(words)}  # duplicates ("was") keep their last index

# Convert word_to_idx to a tensor
inputs = torch.tensor([word_to_idx[w] for w in words])

# Initialize embedding layer with ten dimensions
embedding = nn.Embedding(num_embeddings=len(words), embedding_dim=10)

# Pass the tensor to the embedding layer
output = embedding(inputs)
print(output)
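The printed tensor has shape (15, 10): one ten-dimensional vector for each entry in words. It also carries a grad_fn, because the embedding weights are trainable parameters that will be refined as the model learns.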