Reading Spark configurations
You've recently configured a cluster via a cloud provider. Your only access is through the command shell or your Python code. You'd like to verify some Spark settings to validate the configuration of the cluster.
The spark object is available for use.
This exercise is part of the course
Cleaning Data with PySpark
Exercise instructions
- Check the name of the Spark application instance ('spark.app.name').
- Determine the TCP port the driver runs on ('spark.driver.port').
- Determine how many partitions are configured for joins.
- Show the results.
Hands-on interactive exercise
Try to solve this exercise by completing the sample code.
# Name of the Spark application instance
app_name = spark.conf.get('spark.app.name')

# Driver TCP port
driver_tcp_port = spark.conf.get('spark.driver.port')

# Number of join partitions
num_partitions = spark.conf.get('spark.sql.shuffle.partitions')

# Show the results
print("Name: %s" % app_name)
print("Driver TCP port: %s" % driver_tcp_port)
print("Number of partitions: %s" % num_partitions)