
Understanding SparkContext

A SparkContext is the entry point to Spark functionality. It's like a key to your car. When you run any Spark application, a driver program starts; it contains the main function, and your SparkContext is initialized there. In the PySpark shell, PySpark automatically creates a SparkContext for you (so you don't have to create one yourself) and exposes it via the variable sc.

In this simple exercise, you'll find out the attributes of the SparkContext in your PySpark shell which you'll be using for the rest of the course.

This exercise is part of the course

Big Data Fundamentals with PySpark

Exercise instructions

  • Print the version of SparkContext in the PySpark shell.
  • Print the Python version of SparkContext in the PySpark shell.
  • What is the master of SparkContext in the PySpark shell?

Hands-on interactive exercise

Finish this exercise by completing the sample code below.

# Print the version of SparkContext
print("The version of Spark Context in the PySpark shell is", sc.____)

# Print the Python version of SparkContext
print("The Python version of Spark Context in the PySpark shell is", ____.pythonVer)

# Print the master of SparkContext
print("The master of Spark Context in the PySpark shell is", ____.____)