Authoring Data Pipelines With the New DLT Editor
Overview
Experience | In Person
---|---
Type | Breakout
Track | Data Engineering and Streaming
Industry | Enterprise Technology
Technologies | DLT, LakeFlow
Skill Level | Intermediate
Duration | 40 min
We’re introducing a new developer experience for DLT, designed for data practitioners who prefer a code-first approach and expect robust developer tooling. The new multi-file editor brings an IDE-like environment to declarative pipeline development, making it easy to structure transformation logic, configure pipelines throughout the development lifecycle, and iterate efficiently.
Features like contextual data previews and selective table updates enable step-by-step development. UI-driven tools, such as DAG previews and DAG-based actions, enhance productivity for experienced users and provide a bridge for those transitioning to declarative workflows.
In this session, we’ll showcase the new editor in action, highlighting how these enhancements simplify declarative coding and improve development for production-ready data pipelines. Whether you’re an experienced developer or new to declarative data engineering, join us to see how DLT can enhance your data practice.
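To make "declarative pipeline development" concrete, here is a minimal sketch of the kind of code the editor targets, using the Databricks `dlt` Python API; the table names, columns, and source path are illustrative assumptions, and the snippet runs only inside a DLT pipeline (where `spark` is provided):

```python
import dlt
from pyspark.sql.functions import col

# Hypothetical bronze table: ingest raw events from cloud storage.
@dlt.table(comment="Raw events ingested from cloud storage (path is illustrative).")
def raw_events():
    return spark.read.format("json").load("/data/events/")  # hypothetical path

# Hypothetical silver table, declaratively derived from the table above;
# the expectation drops rows with a NULL id.
@dlt.table(comment="Events filtered to valid records.")
@dlt.expect_or_drop("valid_id", "id IS NOT NULL")
def clean_events():
    return dlt.read("raw_events").select(col("id"), col("ts"), col("payload"))
```

Because each table is a plain Python function, features like selective table updates can refresh `clean_events` alone, and the DAG preview is derived directly from the `dlt.read` dependency.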
Session Speakers
Adriana Ispas
Sr. Staff Product Manager
Databricks
Camiel Steenstra
Staff Software Engineer
Databricks