Reading Spark configurations
You've recently configured a cluster via a cloud provider. Your only access is via the command shell or your Python code. You'd like to verify some Spark settings to validate the cluster's configuration.
The spark object is available for use.
This exercise is part of the course Cleaning Data with PySpark.
Exercise instructions
- Check the name of the Spark application instance ('spark.app.name').
- Determine the TCP port the driver runs on ('spark.driver.port').
- Determine how many partitions are configured for joins.
- Show the results.
Hands-on interactive exercise
The completed sample code:

# Name of the Spark application instance
app_name = spark.conf.get('spark.app.name')

# Driver TCP port
driver_tcp_port = spark.conf.get('spark.driver.port')

# Number of join partitions
num_partitions = spark.conf.get('spark.sql.shuffle.partitions')

# Show the results
print("Name: %s" % app_name)
print("Driver TCP port: %s" % driver_tcp_port)
print("Number of partitions: %s" % num_partitions)