
Tokenization: sentences

Animal Farm is a popular book for middle school English teachers to assign to their students. You have decided to explore the text and provide summary statistics that teachers can use when assigning this book. You already know that there are 10 chapters, and you can use tokenization to count the number of sentences, words, and even paragraphs. In this exercise, you will use the tokenization techniques from the video to split Animal Farm into sentences and count them by chapter.
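
To give a sense of how such counts come together, here is a minimal sketch using the tidytext package that counts words per chapter. It assumes animal_farm carries a chapter column alongside text_column; that column name is an assumption, not something given in this exercise.

# Sketch: tokenize text_column into words, then count words per chapter.
# `chapter` is an assumed column name, used here only for illustration.
library(dplyr)
library(tidytext)

animal_farm %>%
  unnest_tokens(output = "word", input = text_column, token = "words") %>%
  count(chapter, name = "word_count")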

This exercise is part of the course

Introduction to Natural Language Processing in R


Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

# Split the text_column into sentences
animal_farm %>%
  ___(output = "sentences", input = text_column, token = ___)
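
One way to fill in the blanks, shown here as a sketch rather than the graded solution: tidytext's unnest_tokens() accepts token = "sentences", and piping the result into count() tallies sentences per chapter (again assuming animal_farm has a chapter column).

# Sketch: split text_column into sentences, then count sentences per chapter.
# `chapter` is an assumed column name.
library(dplyr)
library(tidytext)

animal_farm %>%
  unnest_tokens(output = "sentences", input = text_column, token = "sentences") %>%
  count(chapter, name = "sentence_count")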