Lakeflow Declarative Pipelines Integrations and Interoperability: Get Data From — and to — Anywhere

Overview

Tuesday, June 10, 1:50 pm

Experience: In Person
Type: Breakout
Track: Data Engineering and Streaming
Industry: Energy and Utilities, Enterprise Technology, Manufacturing
Technologies: Apache Spark, DLT
Skill Level: Intermediate
Duration: 40 min

This session is repeated. In this session, you will learn how to integrate Lakeflow Declarative Pipelines with external systems to ingest data from, and send data to, virtually anywhere. Lakeflow Declarative Pipelines is most often used for ingestion and ETL into the lakehouse. New capabilities such as the Sinks API and added support for the Python Data Source API and foreachBatch have opened up Lakeflow Declarative Pipelines to almost any integration, including popular Apache Spark™ integrations such as JDBC, Kafka, external and managed Delta tables, Azure Cosmos DB, MongoDB and more.
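As a taste of the Sinks API mentioned above, the sketch below shows the general shape of defining a Kafka sink and an append flow that feeds it. It is illustrative only: it runs solely inside a Databricks Lakeflow/DLT pipeline (the `dlt` module is not available elsewhere), and the sink name, topic, broker address, and source table are placeholder assumptions, not values from this session.

```python
import dlt  # available only inside a Databricks pipeline runtime

# Define an external sink (hypothetical Kafka broker/topic for illustration).
dlt.create_sink(
    name="example_kafka_sink",
    format="kafka",
    options={
        "kafka.bootstrap.servers": "broker:9092",  # placeholder address
        "topic": "example_topic",                  # placeholder topic
    },
)

# An append flow streams rows from a pipeline table into the sink.
@dlt.append_flow(name="example_sink_flow", target="example_kafka_sink")
def example_sink_flow():
    # "source_table" is a placeholder for a table defined elsewhere in the pipeline.
    return spark.readStream.table("source_table")
```

The same pattern applies to other targets: the sink declares where data goes, while one or more append flows declare what flows into it.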

Session Speakers

Ryan Nienhuis

Sr. Staff Product Manager
Databricks