Data Triggers and Advanced Control Flow With Lakeflow Jobs

Overview

Wednesday, June 11, 1:50 pm

Experience: In Person
Type: Breakout
Track: Data Engineering and Streaming
Industry: Enterprise Technology
Technologies: Databricks Workflows, LakeFlow
Skill Level: Intermediate
Duration: 40 min

Lakeflow Jobs is the production-ready, fully managed orchestrator for the entire Lakehouse, with 99.95% uptime. Join us for a deep dive into how you can orchestrate your enterprise data operations, from triggering jobs only when your data is ready to advanced control flow with conditionals, looping and job modularity — with demos!

Attendees will gain practical insights into optimizing their data operations by orchestrating with Lakeflow Jobs:

  • New task types: Publish AI/BI Dashboards, push to Power BI or ingest with Lakeflow Connect
  • Advanced execution control: Reference SQL Task outputs, run partial DAGs and perform targeted backfills
  • Repair runs: Re-run failed pipelines with surgical precision using task-level repair
  • Control flow upgrades: Native for-each loops and conditional logic make DAGs more dynamic and expressive
  • Smarter triggers: Kick off jobs based on file arrival or Delta table changes, enabling responsive workflows
  • Code-first approach to pipeline orchestration
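To make the triggers and control-flow features above concrete, here is a minimal sketch of what a job definition combining them might look like, using Jobs API-style fields in YAML. All names in it are placeholders (the job name, volume path, task keys, notebook path, and the value reference in the condition are illustrative assumptions, not from the session):

```yaml
# Illustrative Lakeflow Jobs definition (field names follow the Jobs API;
# every concrete value below is a placeholder).
name: orders_ingest_example
trigger:
  # Smarter trigger: run only when new files land in this location
  file_arrival:
    url: /Volumes/main/landing/orders/
tasks:
  - task_key: process_each_region
    # Native for-each loop: fan out one iteration per input value
    for_each_task:
      inputs: '["us", "eu", "apac"]'
      task:
        task_key: process_region_iteration
        notebook_task:
          notebook_path: /Workspace/etl/process_region
  - task_key: publish_if_ok
    depends_on:
      - task_key: process_each_region
    # Conditional logic: gate the downstream publish step on an
    # upstream task value (reference syntax shown is illustrative)
    condition_task:
      op: EQUAL_TO
      left: "{{tasks.process_each_region.values.status}}"
      right: "success"
```

A definition like this could be created via the Jobs UI, the REST API, or as code in a Databricks Asset Bundle, in keeping with the code-first approach the session highlights.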

Session Speakers

Prashanth Babu Velanati Venkata

Product Specialist
Databricks

Anthony Podgorsak

Product Manager
Databricks