Reading Spark configurations
You've recently configured a cluster via a cloud provider. Your only access is via the command shell or your Python code. You'd like to verify some Spark settings to validate the configuration of the cluster.
The spark object is available for use.
This exercise is part of the course
Cleaning Data with PySpark
Exercise instructions
- Check the name of the Spark application instance ('spark.app.name').
- Determine the TCP port the driver runs on ('spark.driver.port').
- Determine how many partitions are configured for joins.
- Show the results.
Hands-on interactive exercise
Try this exercise by completing the sample code below.
# Name of the Spark application instance
app_name = spark.____.get(____)
# Driver TCP port
driver_tcp_port = ____
# Number of join partitions
num_partitions = ____('spark.sql.shuffle.____')
# Show the results
print("Name: %s" % ____)
print("Driver TCP port: %s" % ____)
print("Number of partitions: %s" % ____)