Components in a Spark Cluster
Spark is a distributed computing platform. It achieves efficiency by distributing data and computation across a cluster of computers.
A Spark cluster consists of a number of hardware and software components that work together.
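As a minimal sketch of how these components come into play, the snippet below shows a driver program creating a SparkSession, which connects to a cluster manager identified by a master URL. The master URL and application name here are placeholders for illustration only; a real deployment would point at a Spark standalone, YARN, or Kubernetes master instead of local mode.

```python
from pyspark.sql import SparkSession

# The driver program creates a SparkSession; the master URL tells it which
# cluster manager to request worker resources from. "local[*]" is a
# placeholder that runs everything on the local machine using all cores.
spark = SparkSession.builder \
    .master("local[*]") \
    .appName("cluster_components_demo") \
    .getOrCreate()

# The SparkContext inside the session is the driver's handle to the cluster.
print(spark.sparkContext.master)

spark.stop()
```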
Which of these is not part of a Spark cluster?