The Write Way: Building Bi-Directional Data Pipelines in the Lakehouse
Overview
| Experience | In Person |
|---|---|
| Track | Data Engineering & Streaming |
| Industry | Energy & Utilities, Healthcare & Life Sciences, Consulting & Services |
| Technologies | Lakeflow, Databricks Apps, Lakebase |
| Skill Level | Beginner |
Ever wondered how to bridge the gap between analytical data and operational workflows without breaking your lakehouse architecture? What if business users could edit data directly in a web app while maintaining a single source of truth? Traditional lakehouses excel at analytics but struggle with user-driven updates, often creating silos or complex workarounds.

There's a better way. In this session we'll cover a full writeback loop: the lakehouse (Unity Catalog) syncs to PostgreSQL (Lakebase) for operational speed, a Databricks App offers an intuitive editing interface, and Lakeflow jobs sync changes back every five minutes. The lakehouse stays the source of truth while users get flexibility.

We'll explore scenarios including form-based data entry, table editing, and data quality workflows, all built with Dash and Databricks Asset Bundles. By the end, you'll have a clear blueprint for building bi-directional data flows that empower users without sacrificing governance or reliability.
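To make the writeback loop concrete, here is a minimal, self-contained Python sketch of the pattern the abstract describes. All names and data are hypothetical: plain dicts stand in for the Unity Catalog table and the Lakebase/Postgres store, and a function call stands in for the scheduled Lakeflow job. A real implementation would use the actual Databricks sync and merge machinery; this only illustrates the last-write-wins round trip.

```python
from datetime import datetime, timezone

# Hypothetical in-memory model of the loop: lakehouse -> operational store
# -> user edit in the app -> scheduled writeback merge.
lakehouse = {  # source of truth, keyed by record id
    1: {"name": "Site A", "status": "active",
        "updated_at": datetime(2024, 1, 1, tzinfo=timezone.utc)},
    2: {"name": "Site B", "status": "active",
        "updated_at": datetime(2024, 1, 1, tzinfo=timezone.utc)},
}

def sync_to_operational(source):
    """Snapshot the lakehouse into the operational store (lakehouse -> Postgres)."""
    return {rid: dict(row) for rid, row in source.items()}

def apply_user_edit(operational, record_id, **changes):
    """An app form/table edit: update fields and stamp the edit time."""
    operational[record_id].update(changes)
    operational[record_id]["updated_at"] = datetime.now(timezone.utc)

def writeback(lakehouse, operational):
    """The scheduled job: merge rows whose edits are newer (last-write-wins)."""
    for rid, row in operational.items():
        if rid not in lakehouse or row["updated_at"] > lakehouse[rid]["updated_at"]:
            lakehouse[rid] = dict(row)

ops = sync_to_operational(lakehouse)
apply_user_edit(ops, 2, status="decommissioned")
writeback(lakehouse, ops)
print(lakehouse[2]["status"])  # decommissioned
```

The key design point mirrored here is that users only ever touch the operational copy; the lakehouse changes solely through the merge step, which is what preserves it as the single source of truth.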
Session Speakers
Falek Miah
Advancing Analytics