Lakeflow Jobs: Modern Orchestration for the Data Platform

Overview

Experience: In Person
Track: Data Engineering & Streaming
Industry: Manufacturing, Financial Services, Transportation
Technologies: Lakeflow
Skill Level: Intermediate

Modern data platforms need more than a scheduler. They need orchestration that understands data, integrates with the platform, and scales with both engineering and AI workloads. Join this session to see how Lakeflow Jobs helps teams orchestrate pipelines, notebooks, dbt projects, SQL, and external components natively on the Databricks Platform.

We will show how Lakeflow Jobs goes beyond basic scheduling with data-aware orchestration that understands the assets and systems it coordinates. You will see how teams can trigger workflows based on data changes, build modular workflows in Python, orchestrate external systems alongside Databricks workloads, and use Databricks Asset Bundles (DABs) for a more reliable CI/CD path.

We will also cover how Serverless Jobs and observability features such as lineage and system tables help teams reduce operational overhead and better understand what is happening across their workflows. Finally, we will preview how agentic authoring and diagnostics can speed up building, troubleshooting, and improving production jobs, and how you can embed agents directly into your workflows.

Whether you are modernizing from legacy schedulers or building new workloads on the lakehouse, this session will share practical patterns, product direction, and live demos for modern orchestration on Databricks.
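As a flavor of the data-aware orchestration and CI/CD path described above, here is a minimal Databricks Asset Bundle sketch: a `databricks.yml` defining a job that runs when new files arrive, rather than on a fixed schedule. The bundle name, volume path, and notebook path are illustrative placeholders, not taken from the session.

```yaml
# databricks.yml — minimal sketch; names and paths are placeholders
bundle:
  name: orchestration_demo

resources:
  jobs:
    ingest_job:
      name: ingest-on-arrival
      # Data-aware trigger: run when new files land in the monitored location
      trigger:
        pause_status: UNPAUSED
        file_arrival:
          url: /Volumes/main/raw/landing/
      tasks:
        - task_key: transform
          notebook_task:
            notebook_path: ./notebooks/transform
```

A bundle like this is deployed with the Databricks CLI (for example, `databricks bundle deploy -t dev`), which is what makes the workflow definition versionable and promotable through CI/CD.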

Session Speakers

Saad Ansari

Product Manager
Databricks