Exercise

Understanding SparkContext

A SparkContext represents the entry point to Spark functionality. It's like a key to your car: nothing happens without it. When you run any Spark application, a driver program starts; it contains the main function, and your SparkContext is initialized there. In the PySpark shell, PySpark automatically creates a SparkContext for you (so you don't have to create it yourself) and exposes it through the variable sc.
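Outside the shell, a standalone driver program has to create its own SparkContext. Here is a minimal sketch of what that might look like; the master URL "local[*]" and the app name "my_app" are placeholder choices, not part of this exercise:

```python
from pyspark import SparkContext

# The driver program's main entry point initializes the SparkContext.
# "local[*]" and "my_app" are illustrative placeholders.
sc = SparkContext(master="local[*]", appName="my_app")

print(sc.version)  # confirm the context is up by printing the Spark version

sc.stop()          # release resources when the application finishes
```

In the PySpark shell you skip this step entirely, since the shell performs it for you and hands you the result as sc.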

In this simple exercise, you'll explore the attributes of the SparkContext in your PySpark shell, which you'll be using for the rest of the course.

Instructions

100 XP
  • Print the version of SparkContext in the PySpark shell.
  • Print the Python version of SparkContext in the PySpark shell.
  • What is the master of SparkContext in the PySpark shell?
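As a sketch of what these steps might look like in the PySpark shell, the following uses the preconfigured sc variable and SparkContext's standard version, pythonVer, and master attributes:

```python
# Print the version of SparkContext running in the shell
print("Spark version:", sc.version)

# Print the Python version being used by SparkContext
print("Python version:", sc.pythonVer)

# Print the master URL SparkContext is connected to (e.g. "local[*]")
print("Master:", sc.master)
```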