Session

Agentic Data Engineering with Lakeflow, Genie Code, and IDEs

Overview

Experience: In Person
Track: Data Engineering & Streaming
Industry: Enterprise Technology
Technologies: Lakeflow, Agent Bricks
Skill Level: Beginner

AI agents are changing how data engineering gets done. They can reduce manual work across the lifecycle of building and operating production pipelines, including authoring jobs and pipelines, testing changes, troubleshooting failures, and resolving incidents.

In this session, we show how Genie Code can be used with Lakeflow for data engineering. We demonstrate how builders can iterate on Spark Declarative Pipelines (SDP) and Jobs with Genie Code, accelerate authoring, and use agents to investigate failures by tracing logs, dependencies, and candidate fixes.

We also cover how local development tools can be used with Lakeflow, including IDEs and tools like Cursor, Claude Code, and Codex. This lets teams use the tools they already have for Lakeflow development and troubleshooting.

Finally, we show how to safely adopt agents in real environments through sandboxing and guardrails that protect production data, as well as standard engineering practices such as source control, code review, testing, and CI/CD.
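The source-control and CI/CD workflow described here is commonly implemented with Databricks Asset Bundles, which let a pipeline and its deployment targets live in version control alongside the code. The following is a minimal sketch of a bundle configuration; the bundle name, pipeline name, file paths, and workspace host are hypothetical placeholders, not values from this session.

```yaml
# databricks.yml — hypothetical minimal Asset Bundle for a declarative pipeline.
# Checked into source control so changes flow through code review and CI/CD.
bundle:
  name: example_sdp_bundle

resources:
  pipelines:
    example_pipeline:
      name: example-pipeline
      libraries:
        # Pipeline source files iterated on locally in an IDE.
        - file:
            path: ./transformations/ingest.py

targets:
  dev:
    # Development target: isolated, safe to deploy from a laptop.
    mode: development
    workspace:
      host: https://example.cloud.databricks.com
  prod:
    # Production target: typically deployed only by a CI/CD pipeline.
    mode: production
    workspace:
      host: https://example.cloud.databricks.com
```

A setup like this supports the guardrails the session describes: agents and developers iterate against the development target, while production deploys go through review and automation rather than direct edits.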

Session Speakers


Gal Oshri

Sr. Staff Product Manager
Databricks


Lennart Kats

Databricks