Word vector projection
You can visualize word vectors in a scatter plot to help you understand how the vocabulary words are grouped. To do so, you first need to project the vectors into a two-dimensional space, which you can do by extracting their two principal components via Principal Component Analysis (PCA).
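As a minimal, self-contained sketch of that projection step (using randomly generated stand-in vectors instead of real word vectors), the fit and transform might look like this:

import numpy as np
from sklearn.decomposition import PCA

# Stand-in for a (4, 300) array of word vectors
vectors = np.random.rand(4, 300)

# Keep the two principal components and project the vectors onto them
pca = PCA(n_components=2)
vectors_2d = pca.fit_transform(vectors)

# Each row is now a 2D point that can be placed on a scatter plot
print(vectors_2d.shape)  # (4, 2)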
In this exercise, you will practice how to extract word vectors and project them into two-dimensional space using the PCA library from sklearn.
A short list of words, stored in the words list, and the en_core_web_md model are available for use. The model is loaded as nlp. All necessary libraries and packages (PCA, numpy as np) are already imported for your use.
This exercise is part of the course
Natural Language Processing with spaCy
Exercise instructions
- Extract the word IDs from the given words and store them in the word_ids list.
- Extract the first five elements of the word vectors of the words and then stack them vertically using np.vstack() in word_vectors.
- Given a pca object, calculate the transformed word vectors using the .fit_transform() function of the pca class.
- Print the first component of the transformed word vectors using [:, 0] indexing.
Interactive exercise
Complete the sample code to finish this exercise.
words = ["tiger", "bird"]
# Extract word IDs of given words
word_ids = [nlp.____.____[w] for w in words]
# Extract word vectors and stack the first five elements vertically
word_vectors = np.vstack([nlp.____.____[i][:5] for i in word_ids])
# Calculate the transformed word vectors using the pca object
pca = PCA(n_components=2)
word_vectors_transformed = pca.____(____)
# Print the first component of the transformed word vectors
print(____[:, 0])
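For reference, one possible completion of the scaffold above is sketched below. It adds the imports and the model-loading call so the script runs on its own, assuming en_core_web_md is installed locally; in the exercise environment, nlp, words, and the imports are already provided.

import numpy as np
import spacy
from sklearn.decomposition import PCA

nlp = spacy.load("en_core_web_md")
words = ["tiger", "bird"]

# Extract word IDs (hash values) of the given words
word_ids = [nlp.vocab.strings[w] for w in words]

# Extract word vectors and stack the first five elements vertically
word_vectors = np.vstack([nlp.vocab.vectors[i][:5] for i in word_ids])

# Calculate the transformed word vectors using the pca object
pca = PCA(n_components=2)
word_vectors_transformed = pca.fit_transform(word_vectors)

# Print the first component of the transformed word vectors
print(word_vectors_transformed[:, 0])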