
Editing Lakehouse Data

1. Editing Lakehouse Data

In the previous video, we saw that Lakehouse data couldn’t be updated using the SQL Analytics Endpoint. In this video, we’ll look at how you can update your lakehouse data.

2. Read-Only lakehouses

Let’s begin by returning to this image. It shows that when using Lakehouses, the SQL Analytics Endpoint is read-only, while Warehouses have both read and write access. But what does read-only actually mean? What types of SQL statements are allowed, and which are not?

3. Data Manipulation Language (DML) Statements

There is a group of SQL keywords called Data Manipulation Language statements, or DML for short. These keywords, like UPDATE, INSERT, and DELETE, all focus on updating table data. The SQL Analytics Endpoint has no DML capabilities. This makes sense; think about the name of the SQL Analytics Endpoint! It’s for analysis, not data manipulation.

4. DML statements in notebooks: Spark SQL

So, how can you update data found in a lakehouse? Are you stuck with the data that you initially added to your tables? Luckily, you can use notebooks to edit your lakehouse data. Recall that notebooks can interact with your data using various languages, including Python and SQL. The exact same SQL code that failed in the SQL Analytics Endpoint will succeed when using a Spark SQL cell in a notebook.

5. DML statements in notebooks: PySpark

Using Python is a bit more complicated, but the fundamental concepts remain the same. You can use PySpark to write new data to your tables. In fact, there are a few different modes that add data in different ways. For example, if you want to completely overwrite existing data, you can use "overwrite" mode. On the other hand, "append" mode will add new data to the end of your existing table. Again, we'll stress that the specifics of the programming language shouldn't be your focus in this course. Instead, we hope you begin to understand the different tools in the Fabric environment and how they interact with lakehouses and warehouses.

6. Let's practice!

Let’s jump into some exercises where you’ll work in both the SQL Analytics Endpoint and a notebook.
