
Simplifying Data Pipelines With DLT: A Beginner’s Guide

Overview

Experience: In Person
Type: Breakout
Track: Data Engineering and Streaming
Industry: Enterprise Technology, Health and Life Sciences, Financial Services
Technologies: DLT, Lakeflow
Skill Level: Beginner
Duration: 40 min

As part of the new Lakeflow data engineering experience, DLT makes it easy to build and manage reliable data pipelines. It unifies batch and streaming, reduces operational complexity and ensures dependable data delivery at scale — from batch ETL to real-time processing.


DLT excels at declarative change data capture, batch and streaming workloads, and efficient SQL-based pipelines. In this session, you’ll learn how we’ve reimagined data pipelining with DLT (a brief pipeline sketch follows the list below), including:

  • A brand-new pipeline editor that simplifies transformations
  • Serverless compute modes to optimize for performance or cost
  • Full Unity Catalog integration for governance and lineage
  • Reading/writing data with Kafka and custom sources
  • Monitoring and observability for operational excellence
  • “Real-time Mode” for ultra-low-latency streaming
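
As a taste of the declarative style, here is a minimal sketch of a two-table DLT pipeline in Python. The Kafka broker address and topic name are placeholders, and `spark` is the session object the DLT runtime provides; a real pipeline would add schema handling and expectations.

    import dlt
    from pyspark.sql.functions import col

    # Bronze: ingest raw records from Kafka.
    # Broker address and topic name below are placeholders.
    @dlt.table(comment="Raw events streamed from Kafka.")
    def raw_events():
        return (
            spark.readStream.format("kafka")  # `spark` comes from the DLT runtime
            .option("kafka.bootstrap.servers", "broker:9092")
            .option("subscribe", "events")
            .load()
        )

    # Silver: DLT infers the dependency on raw_events and handles orchestration.
    @dlt.table(comment="Decoded, non-null events.")
    def clean_events():
        return (
            dlt.read_stream("raw_events")
            .select(col("key").cast("string"), col("value").cast("string"))
            .where(col("value").isNotNull())
        )

Because the tables are declared rather than scheduled by hand, the pipeline’s dependency graph, retries, monitoring and lineage come from DLT itself.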


Join us to see how DLT powers better analytics and AI with reliable, unified pipelines.

Session Speakers

Matt Jones

Senior Product Marketing Manager
Databricks


Brad Turnbaugh

Data Engineer
84.51°