Session

Lakeflow Designer: No-Code ETL, Authored With AI, Built for Production

Overview

Experience: In Person
Track: Data Engineering & Streaming
Industry: Healthcare & Life Sciences, Manufacturing, Financial Services
Technologies: Lakeflow
Skill Level: Beginner

As demand for data grows, central data teams can’t build every transformation themselves. But traditional “self-service” ETL tools often introduce new problems: logic leaves governed platforms, pipelines run in parallel runtimes, and productionization requires costly rewrites by data engineering teams.

Lakeflow Designer addresses this gap by giving analysts a no-code, AI-first way to do data prep on the same Lakeflow execution model used by data engineering teams. Instead of introducing a parallel toolchain, Designer produces governed, production-grade transformations that fit directly into existing data engineering workflows.

In this session, you’ll see how Lakeflow Designer:

  • Enables analysts to do data prep without moving data outside Databricks
  • Preserves centralized governance, quality, and lineage through Unity Catalog
  • Lets data engineers scale pipeline development by directly operationalizing Designer-authored logic, without rebuilding, translating, or refactoring

Session Speakers


Jason Messer

Product Manager
Databricks

Emanuel Zgraggen

Staff Software Engineer
Databricks