Session

10+ reasons to use Databricks’ Delta Live Tables for your next data processing project

Overview

Experience: In Person
Type: Lightning Talk
Track: Data Engineering and Streaming
Industry: Professional Services
Technologies: Apache Spark, Delta Lake, DLT
Skill Level: Intermediate
Delta Live Tables (DLT)'s home page says: “It’s a declarative ETL framework (...) that helps data teams simplify streaming and batch ETL cost-effectively. Simply define the transformations to perform on your data and let DLT pipelines automatically manage task orchestration, cluster management, monitoring, data quality and error handling.”

This talk aims to show you how DLT saved me a lot of trouble while I was on a tight delivery schedule. I’ll show you why the DLT headline is correct. In other words, I hope to convince you to consider the DLT framework for your next ETL project. I found over 10 reasons why investing in DLT for your next project is worth your time.

I will discuss the foundational concepts (Spark SQL, Structured Streaming, Delta Lake) and, more importantly, how they paved the way for Delta Live Tables. The talk is based on my recent experience with two successful projects, which have grown well beyond their humble beginnings and were great fun to be part of.
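As a taste of the declarative style the quote describes, a minimal DLT pipeline definition in Python might look like the sketch below. It is a sketch only: the `dlt` module exists solely inside a Databricks DLT pipeline (not in plain Python), and the table names and source path are hypothetical examples, not from the talk.

```python
import dlt
from pyspark.sql.functions import col

# Bronze: ingest raw JSON files incrementally with Auto Loader.
# The storage path below is a made-up example.
@dlt.table(comment="Raw events ingested from cloud storage")
def events_raw():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/demo/events/")
    )

# Silver: declare a data-quality expectation; rows failing it are dropped.
@dlt.expect_or_drop("valid_user", "user_id IS NOT NULL")
@dlt.table(comment="Cleaned events with a typed timestamp")
def events_clean():
    return (
        dlt.read_stream("events_raw")
        .select("user_id", "event_type", col("ts").cast("timestamp"))
    )
```

Note that nothing here schedules tasks or sizes clusters: DLT infers the dependency between `events_clean` and `events_raw` from the `dlt.read_stream` call and manages orchestration, clusters, and data quality from these declarations alone.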

Session Speakers

Jacek Laskowski

Freelance Data Engineer
japila.pl