
Writing and reading a Delta table

Your team at Global Retail Analytics needs a reliable way to persist cleaned data between sessions. Right now, everything lives in memory and disappears when the cluster shuts down.

Use Delta Lake to save the cleaned retail data as a managed table and read it back through Unity Catalog.
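The two steps above can be sketched in Spark SQL. This is a minimal sketch, not the exercise's solution: the view name `cleaned_retail` and the catalog, schema, and table names `retail.analytics.cleaned_sales` are placeholders, not names given in the exercise.

```sql
-- Persist the cleaned data as a managed Delta table registered in Unity Catalog.
-- All names below are placeholders; adjust them to your workspace.
CREATE TABLE retail.analytics.cleaned_sales
USING DELTA
AS SELECT * FROM cleaned_retail;

-- Read the table back in a later session; the data survives cluster shutdown.
SELECT * FROM retail.analytics.cleaned_sales LIMIT 10;
```

On Databricks with Unity Catalog, Delta is the default table format, so the `USING DELTA` clause is optional; it is spelled out here for clarity.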

This exercise is part of the course Data Transformation with Spark SQL in Databricks.
