Reading Spark configurations
You've recently configured a cluster via a cloud provider. Your only access is via the command shell or your Python code. You'd like to verify some Spark settings to validate the cluster's configuration.
The spark object is available for use.
This exercise is part of the course Cleaning Data with PySpark.
Exercise instructions
- Check the name of the Spark application instance ('spark.app.name').
- Determine the TCP port the driver runs on ('spark.driver.port').
- Determine how many partitions are configured for joins.
- Show the results.
Hands-on interactive exercise
Complete the sample code below to finish this exercise.
# Name of the Spark application instance
app_name = spark.____.get(____)
# Driver TCP port
driver_tcp_port = ____
# Number of join partitions
num_partitions = ____('spark.sql.shuffle.____')
# Show the results
print("Name: %s" % ____)
print("Driver TCP port: %s" % ____)
print("Number of partitions: %s" % ____)