Session

Simplifying Data Pipelines With Lakeflow Declarative Pipelines: A Beginner’s Guide

Overview

Tuesday, June 10, 9:10 am

Experience: In Person
Type: Breakout
Track: Data Engineering and Streaming
Industry: Enterprise Technology, Health and Life Sciences, Financial Services
Technologies: DLT, Lakeflow
Skill Level: Beginner
Duration: 40 min

As part of the new Lakeflow data engineering experience, Lakeflow Declarative Pipelines makes it easy to build and manage reliable data pipelines. It unifies batch and streaming, reduces operational complexity and ensures dependable data delivery at scale, from batch ETL to real-time processing. Lakeflow Declarative Pipelines excels at declarative change data capture, batch and streaming workloads, and efficient SQL-based pipelines. In this session, you’ll learn how we’ve reimagined data pipelining with Lakeflow Declarative Pipelines, including:

  • A brand new pipeline editor that simplifies transformations
  • Serverless compute modes to optimize for performance or cost
  • Full Unity Catalog integration for governance and lineage
  • Reading/writing data with Kafka and custom sources
  • Monitoring and observability for operational excellence
  • “Real-time Mode” for ultra-low-latency streaming

Join us to see how Lakeflow Declarative Pipelines powers better analytics and AI with reliable, unified pipelines.
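To give a concrete feel for the declarative style covered in the session, here is a minimal sketch of a two-table pipeline written with the DLT Python API. The source path, column names, and data-quality expectation are illustrative assumptions, not part of the session materials.

```python
# Minimal sketch of a declarative pipeline (DLT Python API).
# Assumes a hypothetical JSON landing zone at /Volumes/main/default/raw_orders.
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Raw orders ingested incrementally from cloud storage.")
def raw_orders():
    # Auto Loader picks up new files as a stream; schema is inferred.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/Volumes/main/default/raw_orders")
    )

@dlt.table(comment="Cleaned orders with a basic quality expectation.")
@dlt.expect_or_drop("valid_amount", "amount > 0")
def clean_orders():
    # Reads the upstream table declaratively; the pipeline resolves dependencies.
    return dlt.read_stream("raw_orders").select(
        col("order_id"),
        col("customer_id"),
        col("amount").cast("double"),
        col("order_ts"),
    )
```

Each function declares a table and its transformation; the pipeline engine infers the dependency graph, manages incremental processing, and enforces the expectation, rather than requiring hand-written orchestration code.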

Session Speakers

Brad Turnbaugh

Data Engineer
84.51

Matt Jones

Senior Product Marketing Manager
Databricks