
Bringing it all together I

You've built a solid foundation in PySpark, explored its core components, and worked through practical scenarios involving Spark SQL, DataFrames, and advanced operations. Now it's time to bring it all together. Over the next two exercises, you'll create a SparkSession, build a DataFrame, cache that DataFrame, run some analytics, and explain the outcome!
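As a quick preview of the caching and explain steps covered in the second exercise, here is a minimal sketch. It is illustrative only: the toy data and column names are made up and are not part of the exercise environment.

# Illustrative preview: cache a DataFrame and inspect its query plan
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([("Alice", 1), ("Bob", 2)], ["name", "value"])  # toy data

df.cache()    # mark the DataFrame for in-memory caching
df.count()    # run an action so the cache is actually populated
df.explain()  # print the physical plan Spark will use for this DataFrame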

This exercise is part of the course

Introduction to PySpark


Instructions

  • Import SparkSession from pyspark.sql.
  • Create a new SparkSession called my_spark using SparkSession.builder, setting an application name with .appName() and calling .getOrCreate().
  • Print my_spark to the console to verify it's a SparkSession.
  • Create a new DataFrame from the preloaded data and columns variables, then display it.

Hands-on interactive exercise

Try this exercise by completing this sample code.

# Import SparkSession from pyspark.sql
from ____ import ____

# Create my_spark
my_spark = SparkSession.builder.appName(____).____

# Print my_spark
____

# Load dataset into a DataFrame
df = ____(data, schema=columns)

df.show()
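
For reference, here is one possible way to fill in the blanks. It is a minimal sketch, not the official solution: the appName string is an arbitrary choice, and it assumes data (a list of rows) and columns (a list of column names) have been preloaded by the exercise environment.

# Import SparkSession from pyspark.sql
from pyspark.sql import SparkSession

# Create my_spark (the application name here is arbitrary and illustrative)
my_spark = SparkSession.builder.appName("final_exercise").getOrCreate()

# Print my_spark to confirm it is a SparkSession object
print(my_spark)

# Load the preloaded data and columns into a DataFrame
df = my_spark.createDataFrame(data, schema=columns)

df.show()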