Tokenization: sentences
Animal Farm is a popular book for middle school English teachers to assign to their students. You have decided to explore the text and provide summary statistics for teachers to use when assigning this book. You already know that the book has 10 chapters, but you also know that tokenization can help you count the number of sentences, words, and even paragraphs. In this exercise, you will use the tokenization techniques covered in the video to split Animal Farm into sentences and count them by chapter.
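For context, the tidytext package's unnest_tokens() function accepts different tokenizers through its token argument, including "words", "sentences", and "paragraphs". The sketch below is illustrative only and assumes animal_farm is a data frame with a text_column column, as in the sample code further down.

library(tidytext)
library(dplyr)

# Word-level tokenization: one row per word (the output column name "word" is our choice)
animal_farm %>%
  unnest_tokens(output = "word", input = text_column, token = "words")

# Paragraph-level tokenization: one row per paragraph
animal_farm %>%
  unnest_tokens(output = "paragraph", input = text_column, token = "paragraphs")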
This exercise is part of the course
Introduction to Natural Language Processing in R
Interactive exercise
Try this exercise by completing the sample code.
library(tidytext)
library(dplyr)

# Split the text_column into sentences with unnest_tokens()
animal_farm %>%
  unnest_tokens(output = "sentences", input = text_column, token = "sentences")
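Once the text is split into sentences, a grouped count gives the per-chapter totals. A minimal sketch, assuming animal_farm also contains a chapter column (that column name is an assumption, not given in the exercise):

library(tidytext)
library(dplyr)

# Tokenize into sentences, then count the resulting rows per chapter
animal_farm %>%
  unnest_tokens(output = "sentences", input = text_column, token = "sentences") %>%
  count(chapter)  # assumes a `chapter` column exists in animal_farm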