Databricks on Databricks: Build Your Own Experimentation Platform
Overview
| Experience | In Person |
|---|---|
| Track | Governance & Security |
| Industry | Enterprise Technology |
| Technologies | Unity Catalog |
| Skill Level | Intermediate |
Commercial experimentation platforms promise everything: feature flags, metrics, and analysis. But they often limit your flexibility the moment you want to go beyond the basics.
At Databricks, we took a different approach: we built our own full-stack experimentation platform entirely on an internal Databricks deployment, and we use it to power hundreds of product releases every month.
In this talk, we’ll show how you can do the same.
You’ll learn how we:
- Use Unity Catalog to centralize and govern feature enablement data
- Define reusable, trustworthy metrics with our internal Metrics Platform
- Run scalable experiment analyses with Lakeflow Serverless Jobs
- Deliver fast, intuitive user experiences with Lakehouse Apps
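As background for the feature-enablement piece above, the core of any feature-flag system is a deterministic user-to-variant assignment. A minimal sketch of the standard hashing approach (the function name and variant labels here are illustrative, not from the talk):

```python
import hashlib

def assign_variant(user_id: str, experiment: str,
                   variants: tuple[str, ...] = ("control", "treatment")) -> str:
    """Deterministically bucket a user into an experiment variant.

    Hashing user_id together with the experiment name keeps each user's
    assignment stable across sessions while remaining independent
    across different experiments.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Same user + same experiment always yields the same variant.
print(assign_variant("user-123", "new-onboarding-flow"))
```

Because assignment is a pure function of the user and experiment IDs, the enablement data recorded in a governed table (e.g., in Unity Catalog) can always be reproduced and audited after the fact.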
Beyond the platform itself, we’ll share the advanced analysis methodologies we’ve layered on top — capabilities that commercial tools don’t support.
Whether you’re a data scientist or platform engineer, you’ll walk away with a blueprint for building a reliable, extensible, and fully owned experimentation platform on Databricks.
Session Speakers
Nicholas Stanisha
Data Scientist
Databricks
Donghan Zhang
Sr. Software Engineer
Databricks