Session
Lakeflow Declarative Pipelines Integrations and Interoperability: Get Data From — and to — Anywhere

Overview
Tuesday, June 10 | 1:50 PM

| Experience | In Person |
|---|---|
| Type | Breakout |
| Track | Data Engineering and Streaming |
| Industry | Energy and Utilities, Enterprise Technology, Manufacturing |
| Technologies | Apache Spark, DLT |
| Skill Level | Intermediate |
| Duration | 40 min |
This session is repeated.

In this session, you will learn how to integrate Lakeflow Declarative Pipelines with external systems to ingest data from, and send data to, virtually anywhere. Lakeflow Declarative Pipelines is most often used for ingestion and ETL into the lakehouse. New capabilities such as the Lakeflow Declarative Pipelines Sinks API, along with added support for the Python Data Source API and foreachBatch, have opened up Lakeflow Declarative Pipelines to support almost any integration. This includes popular Apache Spark™ integrations such as JDBC, Kafka, external and managed Delta tables, Azure Cosmos DB, MongoDB and more.
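As a rough illustration of the Sinks API mentioned above, a pipeline can declare an external sink and stream rows into it with an append flow. The sketch below is a pipeline-definition fragment that runs only inside a Databricks pipeline (where the `dlt` module and `spark` session are provided); the table name, broker address, and topic are hypothetical placeholders, not from the session description.

```python
# Sketch of writing to an external system with the DLT Sinks API.
# Runs only inside a Databricks Lakeflow Declarative Pipeline.
import dlt
from pyspark.sql.functions import to_json, struct

# Declare a Kafka sink as a pipeline target.
dlt.create_sink(
    name="orders_kafka_sink",
    format="kafka",
    options={
        "kafka.bootstrap.servers": "broker:9092",  # placeholder address
        "topic": "orders_out",                     # placeholder topic
    },
)

# An append flow streams rows from a source table into the sink.
@dlt.append_flow(name="orders_to_kafka", target="orders_kafka_sink")
def orders_to_kafka():
    return (
        spark.readStream.table("orders")  # hypothetical source table
        .select(to_json(struct("*")).alias("value"))
    )
```

The same `dlt.create_sink` call accepts `format="delta"` for writing to external Delta tables, which is how a single flow definition can target different destinations.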
Session Speakers
Ryan Nienhuis
Sr. Staff Product Manager
Databricks