
Word vectors projection

You can visualize word vectors in a scatter plot to help you understand how the vocabulary words are grouped. In order to visualize word vectors, you need to project them into a two-dimensional space. You can project vectors by extracting the two principal components via Principal Component Analysis (PCA).
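As a minimal illustration of the projection step (independent of spaCy), the sketch below reduces a handful of 300-dimensional vectors to two principal components with scikit-learn's PCA and draws them in a scatter plot. The random stand-in vectors and the matplotlib plotting are assumptions for demonstration only.

import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA

# Stand-in for word vectors: five random 300-dimensional vectors (assumed data)
rng = np.random.default_rng(0)
vectors = rng.normal(size=(5, 300))

# Project onto the two principal components
pca = PCA(n_components=2)
projected = pca.fit_transform(vectors)  # shape (5, 2)

# Scatter plot of the two-dimensional projection
plt.scatter(projected[:, 0], projected[:, 1])
plt.xlabel("PC 1")
plt.ylabel("PC 2")
plt.show()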

In this exercise, you will practice extracting word vectors and projecting them into two-dimensional space using the PCA class from sklearn.

A short list of words, stored in the words list, and the en_core_web_md model are available for use. The model is loaded as nlp, and all necessary libraries and packages are already imported (PCA, numpy as np).
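If you want to reproduce the setup locally, the following sketch shows an assumed equivalent of the preloaded environment (in the exercise itself these objects are already available):

import numpy as np
import spacy
from sklearn.decomposition import PCA

# Medium English pipeline with 300-dimensional word vectors
nlp = spacy.load("en_core_web_md")

# Short word list used in the exercise
words = ["tiger", "bird"]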

This exercise is part of the course

Natural Language Processing with spaCy


Exercise instructions

  • Extract the word IDs from the given words and store them in the word_ids list.
  • Extract the first five elements of the word vectors of the given words and stack them vertically using np.vstack(), storing the result in word_vectors.
  • Given a pca object, calculate the transformed word vectors using the .fit_transform() method of the pca class.
  • Print the first component of the transformed word vectors using [:, 0] indexing.

Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

words = ["tiger", "bird"]

# Extract word IDs of given words
word_ids = [nlp.____.____[w] for w in words]

# Extract word vectors and stack the first five elements vertically
word_vectors = np.vstack([nlp.____.____[i][:5] for i in word_ids])

# Calculate the transformed word vectors using the pca object
pca = PCA(n_components=2)
word_vectors_transformed = pca.____(____)

# Print the first component of the transformed word vectors
print(____[:, 0])
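For reference, here is one possible way to fill in the blanks, assuming spaCy's standard nlp.vocab.strings and nlp.vocab.vectors lookups (compare against the course solution):

words = ["tiger", "bird"]

# Extract word IDs (hash values) of the given words
word_ids = [nlp.vocab.strings[w] for w in words]

# Extract word vectors and stack the first five elements vertically
word_vectors = np.vstack([nlp.vocab.vectors[i][:5] for i in word_ids])

# Calculate the transformed word vectors using the pca object
pca = PCA(n_components=2)
word_vectors_transformed = pca.fit_transform(word_vectors)

# Print the first component of the transformed word vectors
print(word_vectors_transformed[:, 0])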