Session

Getting the Most Out of Spark Declarative Pipelines: Deep Dive on What’s New and Best Practices

Overview

Experience: In Person
Track: Data Engineering & Streaming
Industry: Healthcare & Life Sciences, Manufacturing, Financial Services
Technologies: Lakeflow
Skill Level: Advanced

Declarative pipelines are becoming the default way teams scale production data pipelines in Spark. Maximizing their value requires understanding the execution model, tradeoffs, and best practices.

In this 90-minute deep dive, Distinguished Engineer Michael Armbrust explores how to get the most out of Lakeflow Spark Declarative Pipelines (SDP), drawing on real-world usage and the latest platform advancements. You’ll learn how to:

  • Apply proven design patterns for building reliable, maintainable pipelines
  • Use SDP effectively across batch and streaming workloads
  • Avoid common pitfalls as pipeline complexity and scale increase

This session is the definitive technical deep dive for teams scaling their data pipelines with declarative frameworks in Spark.

Session Speakers


Michael Armbrust

Distinguished Software Engineer
Databricks