Session
Lakeflow Observability: From UI Monitoring to Deep Analytics
Overview
| Experience | In Person |
| --- | --- |
| Type | Breakout |
| Track | Data Engineering and Streaming |
| Industry | Enterprise Technology |
| Technologies | Databricks Workflows, DLT, Lakeflow |
| Skill Level | Intermediate |
| Duration | 40 min |
Monitoring data pipelines is key to reliability at scale. In this session, we’ll dive into the observability experience in Lakeflow, Databricks’ unified data engineering solution: from intuitive UI monitoring to advanced event analysis, cost observability, and custom dashboards.
We’ll walk through the revamped UX for Lakeflow observability, showing how to:
- Monitor runs, task states, dependencies, and retry behavior in the UI
- Set up alerts for job and pipeline outcomes and failures
- Use pipeline and job system tables for historical insights (sketched below)
- Explore run events and event logs for root-cause analysis (sketched below)
- Analyze metadata to understand and optimize pipeline spend (sketched below)
- Build custom dashboards on system tables to track performance, data quality, freshness, SLAs, and failure trends, and drive automated alerting based on real-time signals (sketched below)
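As a taste of the system-table analysis covered in the session, here is a minimal sketch that summarizes recent job-run outcomes. It assumes a Databricks notebook where `spark` and `display` are predefined, and the table and column names (`system.lakeflow.job_run_timeline`, `result_state`, `period_start_time`) follow the documented schema at the time of writing; verify them in your own workspace.

```python
# Minimal sketch: count job-run outcomes over the last 30 days
# using the Lakeflow jobs system table (names are assumptions to verify).
from pyspark.sql import functions as F

runs = spark.table("system.lakeflow.job_run_timeline")

failure_summary = (
    runs
    # Keep only rows from the last 30 days that have a final result recorded.
    .where(F.col("period_start_time") >= F.date_sub(F.current_date(), 30))
    .where(F.col("result_state").isNotNull())
    .groupBy("job_id", "result_state")
    .agg(F.count("*").alias("run_count"))
    .orderBy(F.desc("run_count"))
)

display(failure_summary)
```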
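For event-log root-cause analysis, a minimal sketch along these lines pulls error-level events for a single pipeline via the `event_log()` table-valued function. The pipeline ID is a placeholder, and the column names (`level`, `event_type`, `message`, `details`) follow the documented event-log schema; check them against your workspace.

```python
# Minimal sketch: list the most recent error-level events for one pipeline.
# "<your-pipeline-id>" is a placeholder; replace it with a real pipeline ID.
errors = spark.sql("""
    SELECT timestamp, event_type, message, details
    FROM event_log("<your-pipeline-id>")
    WHERE level = 'ERROR'
    ORDER BY timestamp DESC
    LIMIT 50
""")
display(errors)
```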
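For cost observability, one way to approximate per-job spend is to join billing usage with list prices, roughly as below. The schema details (`usage_metadata.job_id`, `pricing.default`) are assumptions to verify against the system-table documentation, and list prices ignore any negotiated discounts.

```python
# Minimal sketch: approximate last-30-day list-price spend per job.
# Verify column names against system.billing.usage and
# system.billing.list_prices in your workspace before relying on the numbers.
cost_per_job = spark.sql("""
    SELECT
      u.usage_metadata.job_id                        AS job_id,
      SUM(u.usage_quantity * p.pricing.default)      AS approx_list_cost
    FROM system.billing.usage u
    JOIN system.billing.list_prices p
      ON  u.sku_name = p.sku_name
      AND u.cloud = p.cloud
      AND u.usage_start_time >= p.price_start_time
      AND (p.price_end_time IS NULL OR u.usage_start_time < p.price_end_time)
    WHERE u.usage_metadata.job_id IS NOT NULL
      AND u.usage_date >= date_sub(current_date(), 30)
    GROUP BY u.usage_metadata.job_id
    ORDER BY approx_list_cost DESC
""")
display(cost_per_job)
```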
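And for dashboards and alerting, a query like the following daily failure trend could back a dashboard tile or a SQL alert; again, the table and column names are assumptions to check against your workspace.

```python
# Minimal sketch: daily run counts and failures over the last 90 days,
# suitable as the source query for a dashboard tile or a SQL alert.
daily_failures = spark.sql("""
    SELECT
      date_trunc('DAY', period_end_time)                           AS run_day,
      COUNT(*)                                                     AS total_runs,
      SUM(CASE WHEN result_state = 'SUCCEEDED' THEN 0 ELSE 1 END)  AS failed_runs
    FROM system.lakeflow.job_run_timeline
    WHERE result_state IS NOT NULL
      AND period_end_time >= date_sub(current_date(), 90)
    GROUP BY 1
    ORDER BY run_day
""")
display(daily_failures)
```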
This session will help you unlock full visibility into your data workflows.
Session Speakers
Saad Ansari
Product Management
Databricks
Theresa Hammer
Product Manager
Databricks