
Creating RDDs

In PySpark, you can create an RDD (Resilient Distributed Dataset) in a few different ways. Since you are already familiar with DataFrames, you will create your RDD from a DataFrame. Remember, there's already a SparkSession called spark in your workspace!
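
Before you start, here is a minimal sketch of two common ways to create an RDD. The sample names, ages, and list values are invented purely for illustration, and the SparkSession line is only needed when running this outside the exercise workspace.

from pyspark.sql import SparkSession

# In the exercise, a SparkSession called `spark` already exists;
# this line is only needed when running the sketch on its own
spark = SparkSession.builder.getOrCreate()

# Option 1: build an RDD directly from a Python list with parallelize()
numbers_rdd = spark.sparkContext.parallelize([1, 2, 3, 4, 5])

# Option 2: create a DataFrame first, then access its underlying RDD via .rdd
people_df = spark.createDataFrame([("Alice", 30), ("Bob", 25)], ["name", "age"])
people_rdd = people_df.rdd  # an RDD of Row objects

print(numbers_rdd.collect())  # [1, 2, 3, 4, 5]
print(people_rdd.collect())   # [Row(name='Alice', age=30), Row(name='Bob', age=25)]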

This exercise is part of the course Introduction to PySpark.

Exercise instructions

  • Create a DataFrame called df by reading in salaries.csv.
  • Convert the DataFrame to an RDD.
  • Collect and print the resulting RDD.

Hands-on interactive exercise

Have a go at this exercise by completing this sample code.

# Create a DataFrame
df = spark.____("salaries.csv", header=True, inferSchema=True)

# Convert DataFrame to RDD
rdd = df.____

# Collect the RDD's contents and print them
print(rdd.____)
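
For reference, one plausible way to complete the blanks, assuming salaries.csv is available in your workspace and that the intended methods are read.csv, .rdd, and collect():

# Create a DataFrame by reading the CSV file
df = spark.read.csv("salaries.csv", header=True, inferSchema=True)

# Convert the DataFrame to an RDD of Row objects
rdd = df.rdd

# Collect the RDD's contents back to the driver and print them
print(rdd.collect())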