Delta Live Tables Overview
In this demo, we give you a first look at Delta Live Tables, a cloud service that makes reliable ETL (extract, transform and load) easy on Delta Lake. It helps data engineering teams streamline ETL development with a simple UI and declarative tooling, improve data reliability through defined data quality rules and monitoring for bad data, and scale operations with deep visibility through an event log.
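The core declarative idea, tables defined as functions with data quality rules attached, can be sketched in plain Python. The `expect_or_drop` decorator below is a hypothetical stand-in for Delta Live Tables' expectations, not the real `dlt` module, so the snippet is self-contained and runnable anywhere:

```python
# Minimal sketch of DLT-style "expectations": a declarative quality rule
# attached to a table definition. `expect_or_drop` is a hypothetical
# stand-in for Delta Live Tables' quality rules, not the real dlt API.

def expect_or_drop(name, predicate):
    """Drop rows failing `predicate` and record how many were dropped."""
    def decorator(table_fn):
        def wrapper():
            rows = table_fn()
            kept = [r for r in rows if predicate(r)]
            wrapper.metrics = {name: len(rows) - len(kept)}
            return kept
        wrapper.metrics = {}
        return wrapper
    return decorator

@expect_or_drop("valid_id", lambda r: r.get("id") is not None)
def silver_events():
    # Upstream "bronze" rows, one of which violates the rule.
    return [{"id": 1, "text": "ok"},
            {"id": None, "text": "bad"},
            {"id": 2, "text": "ok"}]

rows = silver_events()
print(rows)                   # the two valid rows
print(silver_events.metrics)  # {'valid_id': 1}
```

In real Delta Live Tables pipelines, the analogous rules are declared on the table definition and the dropped-row counts surface in the event log the overview mentions.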
Streaming Data With Delta Live Tables
Let’s analyze tweets from Data + AI Summit 2022! Modern data engineering requires a more advanced data lifecycle for data ingestion, transformation and processing. In this session, you can learn how the Databricks Lakehouse Platform provides an end-to-end data engineering solution that automates the complexity of building and maintaining data pipelines. Enjoy a fun, live streaming-data example built on a Twitter data stream, Databricks Auto Loader and Delta Live Tables, with sentiment analysis by a Hugging Face model.
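The demo's flow, newly arrived tweets ingested in micro-batches and then scored for sentiment, can be sketched in self-contained Python. The in-memory batches below stand in for Databricks Auto Loader, and the tiny word-list scorer stands in for the Hugging Face model used in the actual demo; both are illustrative assumptions, not the demo's code:

```python
# Toy sketch of the demo pipeline: micro-batches of tweets are ingested,
# then labeled for sentiment. In-memory lists stand in for Auto Loader;
# a word-list scorer stands in for the Hugging Face model.

POSITIVE = {"great", "love", "awesome"}
NEGATIVE = {"bad", "hate", "broken"}

def label(text):
    """Label a tweet by counting positive vs. negative words."""
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

# Two micro-batches, as Auto Loader might deliver newly arrived files.
batches = [
    ["Love the keynote, awesome demos!", "The wifi is broken"],
    ["Lakehouse talk was great"],
]

sentiments = [label(t) for batch in batches for t in batch]
print(sentiments)  # ['positive', 'negative', 'positive']
```

In the live demo the same shape appears as a streaming table reading files with Auto Loader, feeding a downstream table that applies the model.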
How to Create Low Latency Streaming Data Pipelines With Apache Kafka or Amazon Kinesis and Delta Live Tables
As shown at the Current.io 2022 conference in Austin (the next generation of Kafka Summit), this live demo elaborates on how the Databricks Lakehouse Platform simplifies data streaming to deliver streaming analytics and applications on one platform. Learn how to build low-latency streaming data pipelines that ingest from a message bus such as Apache Kafka, Confluent Cloud or any other Kafka-compatible cloud service such as Amazon MSK. The same principles can be used to ingest data from Amazon Kinesis. Frank’s full conference session spans Spark on Databricks, Spark Structured Streaming with Delta Lake, and Delta Live Tables. Slides, code and a blog post are available.
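However the records arrive, the first step of such an ingestion pipeline is the same: decode each Kafka record's binary key/value payload into typed columns for a bronze table. The snippet below sketches that step in plain Python over hand-made records; the topic and field names are illustrative assumptions, and the records stand in for what a Kafka consumer (or Spark's Kafka source) would deliver:

```python
import json

# Kafka delivers records with binary key/value payloads; a bronze
# ingestion table starts by decoding them into typed fields. These
# records are hand-made stand-ins for real consumer output; the topic
# and field names are made up for illustration.
raw_records = [
    {"topic": "clicks", "key": b"user-1",
     "value": b'{"page": "/home", "ms": 12}'},
    {"topic": "clicks", "key": b"user-2",
     "value": b'{"page": "/pricing", "ms": 48}'},
]

def decode(record):
    """Turn one raw Kafka-style record into a flat bronze row."""
    payload = json.loads(record["value"].decode("utf-8"))
    return {"key": record["key"].decode("utf-8"), **payload}

bronze = [decode(r) for r in raw_records]
print(bronze[0])  # {'key': 'user-1', 'page': '/home', 'ms': 12}
```

In the live demo the equivalent decode is expressed as a streaming query over the Kafka source, with the parsed columns landing in a Delta Live Tables bronze table.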