
Ingesting Data into Lakehouse Tables and Files

In this exercise, we'll build a data ingestion pipeline using the Copy Activity in Microsoft Fabric. We'll configure a source dataset, set up a Lakehouse as the destination, and run the pipeline to see the data ingestion in action!
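The exercise configures the Copy Activity through the Fabric UI, but the same configuration is ultimately stored as a pipeline definition. The sketch below is a minimal, illustrative Python script that assembles such a definition as JSON; the general shape (an activities list with a "Copy" activity holding source and sink typeProperties) follows the Data Factory-style schema that Fabric data pipelines are based on, while the pipeline, activity, source, and sink names used here are hypothetical placeholders and should be checked against the current Fabric documentation.

```python
import json

# Minimal sketch of a Copy Activity pipeline definition (illustrative only).
# The overall structure follows the Data Factory-style JSON used by Fabric
# data pipelines; exact type and property names may differ, and all names
# below (pipeline, activity, source, sink) are hypothetical examples.
pipeline_definition = {
    "name": "IngestSalesData",                  # hypothetical pipeline name
    "properties": {
        "activities": [
            {
                "name": "CopySalesToLakehouse", # hypothetical activity name
                "type": "Copy",
                "typeProperties": {
                    # Source: a delimited text file (assumed for illustration)
                    "source": {
                        "type": "DelimitedTextSource",
                    },
                    # Sink: a table in the destination Lakehouse
                    # (assumed type and option names for illustration)
                    "sink": {
                        "type": "LakehouseTableSink",
                        "tableActionOption": "Append",
                    },
                },
            }
        ]
    },
}

# Print the definition so it can be inspected or adapted.
print(json.dumps(pipeline_definition, indent=2))
```

In the exercise itself, the same three pieces are configured interactively: choose the source, pick the Lakehouse (Tables or Files) as the destination, then run the pipeline and watch the data land.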

This exercise is part of the course Data Ingestion and Semantic Models with Microsoft Fabric.
