Components in a Spark Cluster
Spark is a distributed computing platform. It achieves efficiency by distributing data and computation across a cluster of computers.
A Spark cluster consists of several hardware and software components that work together.
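For context, here is a minimal PySpark sketch of how the driver program connects to a cluster manager; the master URL `spark://head-node:7077` is a hypothetical standalone cluster address used only for illustration (a real cluster would supply its own URL, and `local[*]` would run everything on a single machine instead).

```python
from pyspark.sql import SparkSession

# Sketch only: the driver (this script) contacts the cluster manager at a
# hypothetical master URL; the cluster manager allocates executor processes
# on the worker nodes, where tasks actually run.
spark = SparkSession.builder \
    .master("spark://head-node:7077") \
    .appName("cluster_components_demo") \
    .getOrCreate()

# Confirm the session is up; any jobs submitted through it are split into
# tasks and distributed to the executors.
print(spark.version)

spark.stop()
```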
Which of these is not part of a Spark cluster?