Lakeflow Designer: No-Code ETL — Authored With AI, Built for Production
Overview
| Detail | Value |
|---|---|
| Experience | In Person |
| Track | Data Engineering & Streaming |
| Industry | Healthcare & Life Sciences, Manufacturing, Financial Services |
| Technologies | Lakeflow |
| Skill Level | Beginner |
As demand for data grows, central data teams can’t build every transformation themselves. But traditional “self-service” ETL tools often introduce new problems: logic leaves governed platforms, pipelines live in parallel runtimes, and productionization requires costly rewrites by data engineering teams.
Lakeflow Designer addresses this gap by giving analysts a no-code, AI-first way to prepare data on the same Lakeflow execution model that data engineering teams use. Instead of introducing a parallel toolchain, Designer produces governed, production-grade transformations that fit directly into existing data engineering workflows.
In this session, you’ll see how Lakeflow Designer:
- Enables analysts to prepare data without moving it outside Databricks
- Preserves centralized governance, quality, and lineage through Unity Catalog
- Lets data engineers scale pipeline development by directly operationalizing Designer-authored logic without rebuilding, translating, or refactoring (see the sketch below)
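The session abstract itself contains no code, but as a rough illustration of the declarative model these points refer to, the sketch below shows a governed Lakeflow Declarative Pipelines dataset definition in Python. It is a hypothetical example, not Designer output from the session; the catalog, table, and column names and the quality rule are assumptions.

```python
# Minimal sketch of a declarative Lakeflow pipeline dataset.
# Runs inside a Lakeflow pipeline, where `spark` is provided by the runtime.
# All table and column names below are hypothetical.
import dlt
from pyspark.sql import functions as F

@dlt.table(
    name="cleaned_claims",  # hypothetical target table registered in Unity Catalog
    comment="Claims records cleaned for downstream analytics",
)
@dlt.expect_or_drop("valid_amount", "claim_amount > 0")  # quality rule enforced at run time
def cleaned_claims():
    # Read a governed source table; lineage is captured by Unity Catalog.
    return (
        spark.read.table("main.raw.claims")  # hypothetical source table
        .withColumn("claim_date", F.to_date("claim_date"))
        .dropDuplicates(["claim_id"])
    )
```

Because Designer targets this same declarative execution model rather than a separate runtime, definitions like this can move through existing pipeline scheduling and review processes without a rewrite.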
Session Speakers
Jason Messer
Product Manager
Databricks
Emanuel Zgraggen
Staff Software Engineer
Databricks