Using DLT Kafka Sinks
Type
On-Demand Video
Duration
4 minutes 13 seconds
What you’ll learn
DLT Sinks let you write data directly to Apache Kafka, making real-time data publishing easier.
Use create_sink() to define your Kafka connection and topic configuration, then use append_flow() to stream data to Kafka topics continuously, while DLT handles the complexity of reliable message delivery and scaling.
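As a rough illustration of that two-step pattern, here is a minimal pipeline-definition sketch. It only runs inside a Databricks DLT pipeline (where `dlt` and `spark` are provided), and the broker address, topic name, and upstream table name are placeholder assumptions:

```python
import dlt

# Step 1: define the Kafka sink with connection and topic details.
# "broker:9092" and "events" are hypothetical values for illustration.
dlt.create_sink(
    name="kafka_sink",
    format="kafka",
    options={
        "kafka.bootstrap.servers": "broker:9092",
        "topic": "events",
    },
)

# Step 2: continuously stream rows from an upstream table into the sink.
# "events_table" is an assumed upstream dataset in this pipeline.
@dlt.append_flow(name="events_to_kafka", target="kafka_sink")
def events_to_kafka():
    return spark.readStream.table("events_table")
```

With this in place, DLT manages checkpointing, retries, and scaling of the stream to Kafka; no hand-written producer loop is needed.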
Note: at the Data + AI Summit in June 2025, Databricks released Lakeflow. Lakeflow unifies data engineering with Lakeflow Connect, Lakeflow Declarative Pipelines (previously known as DLT), and Lakeflow Jobs (previously known as Workflows).