Streaming retail data into Delta Lake
New order files land daily in your team's volume at Global Retail Analytics.
Instead of reloading everything each morning, you'll build a streaming pipeline that picks up new CSV files automatically, writes them to Delta Lake, and recovers gracefully from restarts using checkpoints.
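The pipeline described above can be sketched as follows. This is a minimal sketch assuming a Databricks notebook environment (where `spark` is predefined) and Auto Loader; the volume paths, schema, and table name are hypothetical placeholders, not part of the exercise.

```python
from pyspark.sql.types import (
    StructType, StructField, StringType, IntegerType, DoubleType
)

# Hypothetical schema for the incoming order CSV files
order_schema = StructType([
    StructField("order_id", StringType()),
    StructField("product_id", StringType()),
    StructField("quantity", IntegerType()),
    StructField("price", DoubleType()),
])

# Auto Loader ("cloudFiles") incrementally picks up new CSV files
# as they land in the volume -- no manual reloads each morning
orders_stream = (
    spark.readStream
         .format("cloudFiles")
         .option("cloudFiles.format", "csv")
         .option("header", "true")
         .schema(order_schema)
         .load("/Volumes/global_retail/orders/landing/")  # hypothetical path
)

# Write to a Delta table; the checkpoint records progress so the
# stream resumes from where it left off after a restart
(
    orders_stream.writeStream
        .format("delta")
        .option("checkpointLocation",
                "/Volumes/global_retail/orders/_checkpoint/")  # hypothetical path
        .trigger(availableNow=True)  # process all new files, then stop
        .toTable("global_retail.orders_bronze")  # hypothetical table name
)
```

With `trigger(availableNow=True)`, each run drains whatever files arrived since the last checkpoint and then stops, which suits a daily-arrival pattern without keeping a cluster running continuously.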
This exercise is part of the course
Data Transformation with Spark SQL in Databricks