Writing and reading a Delta table
Your team at Global Retail Analytics needs a reliable way to persist cleaned data between sessions. Right now, everything lives in memory and disappears when the cluster shuts down.
Use Delta Lake to save the cleaned retail data as a managed table, then read it back through Unity Catalog.
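The workflow can be sketched in Spark SQL. This is a minimal example, assuming a temporary view named `cleaned_retail` holds the cleaned data and that a catalog and schema (here the hypothetical `main.retail`) already exist in Unity Catalog:

```sql
-- Save the cleaned data as a managed Delta table in Unity Catalog.
-- Tables created this way use Delta Lake by default in Databricks.
CREATE TABLE main.retail.cleaned_sales AS
SELECT * FROM cleaned_retail;

-- Read the table back in a later session via its three-level name.
SELECT * FROM main.retail.cleaned_sales LIMIT 10;
```

Because the table is managed, Unity Catalog governs both its metadata and its underlying storage, so the data survives cluster shutdowns and is available to any session with access to the catalog.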
This exercise is part of the course
Data Transformation with Spark SQL in Databricks