Building a Data Application With Lakeflow

Type

On-Demand Video

Duration

4 minutes 53 seconds


What you’ll learn

This video demonstrates an end-to-end approach to building data applications with Lakeflow, covering data ingestion, transformation, and hyperpersonalization. It shows how to integrate data from disparate systems, including Salesforce, Microsoft SQL Server, and legacy mainframes, into a unified lakehouse. You will also see declarative pipelines used for efficient data processing, including how to generate per-customer recommendations and how to monitor job performance, cost, and data quality in a production environment.
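To make the declarative-pipeline idea concrete before watching: tables are declared as functions, and the framework works out the order to run them in. The sketch below is a minimal Python illustration of that concept only; the table names (bronze/silver/gold), the `@table` decorator, and the sample data are all illustrative assumptions, not the actual Lakeflow API shown in the video.

```python
# Minimal sketch of a declarative pipeline: each layer is declared as a
# function, and downstream tables reference upstream ones by name.
# All names and data here are hypothetical, for illustration only.

tables = {}

def table(fn):
    """Register a function as a named pipeline table."""
    tables[fn.__name__] = fn
    return fn

@table
def bronze_orders():
    # Raw ingested records, e.g. landed from Salesforce or SQL Server.
    return [
        {"customer": "a", "item": "book", "qty": 2},
        {"customer": "b", "item": "pen", "qty": 0},   # fails quality check
        {"customer": "a", "item": "lamp", "qty": 1},
    ]

@table
def silver_orders():
    # Cleaned layer: drop rows that fail a data-quality expectation.
    return [r for r in tables["bronze_orders"]() if r["qty"] > 0]

@table
def gold_recommendations():
    # Per-customer aggregation feeding the recommendation step.
    recs = {}
    for r in tables["silver_orders"]():
        recs.setdefault(r["customer"], []).append(r["item"])
    return recs

print(gold_recommendations())
# {'a': ['book', 'lamp']}
```

The point of the declarative style is that each table states *what* it is, not *when* to run; the engine derives the execution order and can attach monitoring and quality checks per table, which is what the video demonstrates at production scale.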


Note: Databricks Lakeflow unifies Data Engineering with Lakeflow Connect, Lakeflow Spark Declarative Pipelines (previously known as DLT), and Lakeflow Jobs (previously known as Workflows).

Ready to get started?