Cage match, part 2! Negative reviews
In both organizations, people mentioned "culture" and "smart people", so the two companies share some positive aspects. However, with the pyramid plot, you can start to infer the degree to which each positive feature applies to each work environment.
You now decide to turn your attention to negative reviews and make the same visual. This time you already have the common_words data frame in your workspace. However, the common bigrams in this exercise come from negative employee reviews.
This exercise is part of the course
Text Mining with Bag-of-Words in R
Exercise instructions
- Using slice_max() on common_words, obtain the top 5 bigrams based on the diff column. The results of the new object will print to your console.
- Create a pyramid.plot(). Pass in top5_df$AmazonNeg, top5_df$GoogleNeg, and labels = top5_df$terms. For better labeling, set gap to 12 and top.labels to c("Amzn", "Neg Words", "Goog"). The main and unit arguments are set for you.
Hands-on interactive exercise
Try this exercise by completing the sample code.
# Extract top 5 common bigrams
(top5_df <- ___ %>% ___(___, n = ___))
# Create a pyramid plot
___(
# Amazon on the left
top5_df$___,
# Google on the right
top5_df$___,
# Use terms for labels
labels = top5_df$___,
# Set the gap to 12
___ = ___,
# Set top.labels to "Amzn", "Neg Words" & "Goog"
___ = ___,
main = "Words in Common",
unit = NULL
)
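For reference, one possible completed version of the sample code is sketched below. It assumes that slice_max() comes from dplyr, that pyramid.plot() comes from the plotrix package, and that common_words contains the diff, AmazonNeg, GoogleNeg, and terms columns named in the instructions.

# Possible solution sketch (assumes dplyr and plotrix are installed and that
# common_words has the diff, AmazonNeg, GoogleNeg, and terms columns)
library(dplyr)
library(plotrix)

# Extract the top 5 common bigrams, ranked by the diff column
(top5_df <- common_words %>% slice_max(diff, n = 5))

# Create a pyramid plot with Amazon counts on the left, Google on the right
pyramid.plot(
  top5_df$AmazonNeg,
  top5_df$GoogleNeg,
  labels = top5_df$terms,
  gap = 12,
  top.labels = c("Amzn", "Neg Words", "Goog"),
  main = "Words in Common",
  unit = NULL
)

Wrapping the assignment in parentheses makes the new top5_df print immediately, and the gap of 12 leaves room for the bigram labels in the center column between the two sets of bars.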