Downloading multiple files using curl
We have 100 data files stored at long, sequentially numbered URLs:
https://s3.amazonaws.com/assets.datacamp.com/production/repositories/4180/datasets/files/datafile001.txt
https://s3.amazonaws.com/assets.datacamp.com/production/repositories/4180/datasets/files/datafile002.txt
...
https://s3.amazonaws.com/assets.datacamp.com/production/repositories/4180/datasets/files/datafile100.txt
To avoid typing these long URLs repeatedly, we'd like to download all of the files with a single curl command.
This exercise is part of the course "Data Processing in Shell".
Exercise instructions
- Download all 100 data files using a single curl command.
- List all downloaded files in the directory.
Hands-on interactive exercise
Have a go at this exercise by completing this sample code.
# Download all 100 data files
curl ___ https://s3.amazonaws.com/assets.datacamp.com/production/repositories/4180/datasets/files/datafile___.txt
# List all downloaded files in the directory
ls datafile*.txt
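As a sketch of one way to fill in the blanks: curl supports URL globbing, where a bracketed numeric range such as [001-100] expands into one request per number, and the -O flag saves each file under its remote name. The loop below is only an offline illustration of the filenames the glob would generate; the actual download line is shown commented out because it requires network access.

```shell
# Base URL shared by all 100 files (from the exercise)
base="https://s3.amazonaws.com/assets.datacamp.com/production/repositories/4180/datasets/files"

# curl's URL globbing: [001-100] expands to 001, 002, ..., 100,
# and -O saves each file under its remote filename.
# (Network call, commented out so this sketch runs offline.)
# curl -O "$base/datafile[001-100].txt"

# The glob produces the same zero-padded names as this loop:
for i in $(seq -w 1 100); do
  echo "datafile$i.txt"
done | head -n 3
```

Note that the range must be zero-padded ([001-100], not [1-100]) to match the three-digit filenames used on the server.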