Session
DLT Integrations and Interoperability: Get Data From — and to — Anywhere
Overview
| Experience | In Person |
| --- | --- |
| Type | Breakout |
| Track | Data Engineering and Streaming |
| Industry | Energy and Utilities, Enterprise Technology, Manufacturing |
| Technologies | Apache Spark, DLT |
| Skill Level | Intermediate |
| Duration | 40 min |
In this session, you will learn how to integrate DLT with external systems to ingest data from, and send data to, virtually anywhere. DLT is most often used for ingestion and ETL into the Lakehouse, but new capabilities such as the DLT Sinks API and added support for the Python Data Source API and foreachBatch open DLT up to almost any integration. This includes popular Apache Spark™ integrations like JDBC, Kafka, external and managed Delta tables, Azure Cosmos DB, MongoDB and more.
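As a flavor of what the session covers, the sketch below shows the DLT Sinks API pattern for writing a streaming table out to Kafka. It only runs inside a Databricks DLT pipeline (where `dlt` and `spark` are provided by the runtime); the sink name, source table, topic, and broker address are illustrative assumptions, not part of the session materials.

```python
import dlt

# Define an external sink; "kafka" is one of the supported sink formats.
# Broker address and topic here are placeholders.
dlt.create_sink(
    name="kafka_sink",
    format="kafka",
    options={
        "kafka.bootstrap.servers": "broker:9092",
        "topic": "orders_out",
    },
)

# An append flow reads from a source in the pipeline and writes to the sink.
# Assumes a streaming table named "orders" already exists in this pipeline.
@dlt.append_flow(name="orders_to_kafka", target="kafka_sink")
def orders_to_kafka():
    # Kafka expects a string or binary "value" column, so serialize each
    # row to JSON before handing it to the sink.
    return (
        spark.readStream.table("orders")
        .selectExpr("to_json(struct(*)) AS value")
    )
```

The same `create_sink` / `append_flow` pairing works for other sink formats (for example, an external Delta path), which is what makes the API a general integration point rather than a Kafka-specific one.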
Session Speakers
Ryan Nienhuis
Sr. Staff Product Manager
Databricks