Session

Forecasting at Databricks: One Framework Behind Consumption and Infra Cost

Overview

Experience: In Person
Track: Analytics & BI
Industry: Enterprise Technology
Technologies: AI/BI
Skill Level: Intermediate

Consumption forecasting and infrastructure cost forecasting at Databricks share three challenges: trust and stakeholder adoption, model accuracy under hierarchy and outliers, and data signal quality. We addressed them on a unified Databricks-native stack. For trust and adoption: a self-serve, human-in-the-loop Databricks App lets infra cost owners review recommended forecasts and inject inorganic shifts the model won't see, AI/BI dashboards and Genie Spaces let users self-serve insights, and decomposition keeps consumption forecasts explainable. For model accuracy: the Unified Forecasting Framework, the backend behind the AI_Forecast() function, is scheduled on Lakeflow Jobs and validates model selection across the hierarchies. For data and signal quality: Unity Catalog, Metric Views, and Lakeflow form the governed data backbone for cost forecasting and the feature-selection substrate for consumption. Attendees leave with Databricks-native patterns for forecasts that domain owners can self-serve and leadership can defend.
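The "validates model selection across the hierarchies" idea can be sketched at toy scale: backtest several candidate models on a holdout window for each hierarchy node and keep the winner. Everything below (the function names, the two candidate models, the sample hierarchy) is invented for illustration and is not the actual Unified Forecasting Framework or AI_Forecast() backend.

```python
# Toy per-node model selection via holdout backtesting (illustrative only).

def naive_last(history, horizon):
    """Forecast by repeating the last observed value."""
    return [history[-1]] * horizon

def mean_model(history, horizon):
    """Forecast by repeating the historical mean."""
    m = sum(history) / len(history)
    return [m] * horizon

def mae(actual, predicted):
    """Mean absolute error between two equal-length sequences."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def select_model(series, candidates, holdout=3):
    """Backtest each candidate on the last `holdout` points; return the best name."""
    train, test = series[:-holdout], series[-holdout:]
    scores = {name: mae(test, f(train, holdout)) for name, f in candidates.items()}
    return min(scores, key=scores.get)

candidates = {"naive_last": naive_last, "mean": mean_model}
hierarchy = {
    "org/total":  [10, 11, 12, 13, 14, 15],  # trending node: last value tracks better
    "org/team_a": [5, 5, 5, 5, 5, 5],        # flat node: both candidates tie
}
chosen = {node: select_model(s, candidates) for node, s in hierarchy.items()}
print(chosen)  # e.g. {'org/total': 'naive_last', ...}
```

A production framework would of course use real forecasting models, rolling-origin backtests, and reconciliation across hierarchy levels; the point here is only the per-node select-by-validation loop.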

Session Speakers


Ginger Holt

Sr. Staff Data Scientist
Databricks

Alex Wang

Sr. Data Scientist
Databricks