Session

Streaming NASA Data with Lakeflow and Agent Bricks: A Data Engineer's Guide to Databricks AI

Overview

Experience: In Person
Track: Data Engineering & Streaming
Industry: Enterprise Technology, Healthcare & Life Sciences, Public Sector
Technologies: Lakeflow, Unity Catalog, Agent Bricks
Skill Level: Intermediate

This Data + AI Summit session walks you through a streaming application that analyzes live NASA space data using Lakeflow and Agent Bricks. Lakeflow is the foundation layer of every AI application. I explain how Lakeflow Spark Declarative Pipelines (SDP) ingest from one of the very few public Kafka topics in existence, and how SDP works in symbiosis with Agent Bricks to turn real-time NASA circulars about cosmic events into an AI-powered knowledge base with a Databricks Apps UI. The AI stack covers AI Gateway, guardrails, Vector Search, agent evaluation, and MLflow Tracing, all from a data engineer's perspective.

The second part focuses on time to value. I used several Databricks AI tools: Genie Code, Claude Code, the AI Dev Kit, and others, each capable of taking you from zero to a working application in days rather than months. I share the trade-offs and best practices I learned.

Grab the full application from GitHub afterward, and play with data from exploding supernovas.
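To give a flavor of the ingestion step described above, here is a minimal sketch of a declarative pipeline table reading from a Kafka topic. It uses the Delta Live Tables Python API that underlies Lakeflow pipelines; the broker address and topic name are assumptions for illustration, not the actual NASA endpoint used in the session, and the code only runs inside a Databricks pipeline (where the `spark` session is provided).

```python
# Sketch of a Lakeflow/DLT streaming table ingesting NASA circulars
# from a public Kafka topic. Broker and topic names are hypothetical.
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Raw NASA circulars about cosmic events, streamed from Kafka")
def nasa_circulars_raw():
    return (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker.example.org:9092")  # assumed
        .option("subscribe", "nasa.circulars")  # hypothetical topic name
        .load()
        # Kafka delivers key/value as binary; cast to strings for downstream use
        .select(
            col("key").cast("string").alias("event_id"),
            col("value").cast("string").alias("circular_text"),
            col("timestamp"),
        )
    )
```

Downstream tables can then clean and chunk `circular_text` before it is indexed by Vector Search and exposed to agents, which is the hand-off point between the Lakeflow and Agent Bricks halves of the talk.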

Session Speakers


Frank Munz

Principal TM Engineer
Databricks