Session
Pipeline to Production in Minutes: How a Leading P&C Insurer Achieved End-to-End DevOps on Databricks
Overview
| Experience | In Person |
|---|---|
| Track | Application Development |
| Industry | Financial Services |
| Technologies | Unity Catalog, Databricks Apps |
| Skill Level | Beginner |
P&C insurance data platforms must deliver analytics rapidly while meeting stringent governance and regulatory requirements. Although Databricks is widely adopted as the analytics backbone, many P&C organizations still struggle with inconsistent deployments, manual promotions, and fragmented DevOps practices.

This lightning talk demonstrates how a P&C organization implemented true end-to-end DevOps on Databricks using Asset Bundles, Databricks Apps, and GitHub Actions. Learn how a Git-driven, declarative deployment model uses Asset Bundles to package notebooks, workflows, and DLT pipelines with environment-aware configuration, enabling consistent promotion across multiple environments, while GitHub Actions orchestrates CI/CD pipelines for validation, automated deployment, and auditability.

The session highlights how Databricks Apps are integrated into the same pipeline to operationalize analytics use cases as secure, production-ready data products governed by Unity Catalog.
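To make the deployment model concrete, the environment-aware promotion described above can be sketched as a minimal `databricks.yml` Asset Bundle configuration. This is an illustrative fragment, not the speaker's actual setup: the bundle name, job, notebook path, and workspace hosts are placeholders.

```yaml
# Illustrative Databricks Asset Bundle config (all names and hosts are placeholders)
bundle:
  name: claims_analytics

resources:
  jobs:
    nightly_etl:
      name: nightly_etl
      tasks:
        - task_key: ingest
          notebook_task:
            notebook_path: ./notebooks/ingest.py

# Environment-aware targets: the same bundle promotes from dev to prod
targets:
  dev:
    mode: development
    workspace:
      host: https://dev-workspace.cloud.databricks.com
  prod:
    mode: production
    workspace:
      host: https://prod-workspace.cloud.databricks.com
```

In a CI/CD pipeline such as GitHub Actions, promotion then reduces to running `databricks bundle validate` on pull requests and `databricks bundle deploy -t dev` (or `-t prod`) on merge, which is what makes the declarative, Git-driven model auditable.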
Session Speakers
Harshit Mishra
Sr. Consultant Data Engineer
Nationwide Insurance