
Delta Lake Data Ingestion Demo

With Databricks Auto Loader, you can incrementally and efficiently ingest new batch and streaming data files into your Delta Lake tables as soon as they arrive in your data lake, so your tables always contain the most complete, up-to-date data available. Auto Loader is a simple, flexible tool that can run continuously or in "triggerOnce" mode to process data in batches. SQL users can use the COPY INTO command to pull new data into their Delta Lake tables automatically, without having to keep track of which files have already been processed.
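As a rough sketch of the COPY INTO path mentioned above (the table name and landing path here are hypothetical placeholders):

```sql
-- Idempotently load new files into a Delta table.
-- COPY INTO tracks which files have already been loaded,
-- so re-running the command skips previously ingested files.
COPY INTO main.default.iot_events        -- hypothetical target table
FROM '/mnt/raw/iot/'                     -- hypothetical landing path
FILEFORMAT = JSON
FORMAT_OPTIONS ('inferSchema' = 'true')
COPY_OPTIONS ('mergeSchema' = 'true');
```

Because file-level progress is tracked by the command itself, this statement can simply be re-run on a schedule to pick up whatever has landed since the last run.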

Hassle-Free Data Ingestion

Video transcript

Getting data into Delta Lake with Auto Loader

Loading raw data into a data warehouse can be a messy, complicated process, but with Databricks, filling your Delta Lake with the freshest data available has never been easier.

Here, we're working with some JSON telemetry data from IoT devices like smart watches that track steps. New data files are landing in our data lake every 5 seconds, so we need a way to automatically ingest them into Delta Lake. Auto Loader provides a new Structured Streaming data source called "cloudFiles" that we can use to do just that.
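A minimal sketch of that "cloudFiles" streaming read, assuming a Databricks runtime where `spark` is the active SparkSession; the paths and table name are hypothetical:

```python
# Read new JSON files from the landing zone with Auto Loader.
df = (spark.readStream
      .format("cloudFiles")                              # Auto Loader source
      .option("cloudFiles.format", "json")               # raw files are JSON
      .option("cloudFiles.schemaLocation", "/mnt/schemas/iot")  # where inferred schema is stored
      .load("/mnt/raw/iot/"))                            # hypothetical landing path

# Write the stream into a Delta table; trigger(once=True) gives the
# batch-style "triggerOnce" behavior, omit it to run continuously.
(df.writeStream
   .option("checkpointLocation", "/mnt/checkpoints/iot")
   .trigger(once=True)
   .toTable("iot_events"))
```

The checkpoint location is what lets Auto Loader remember which files it has already processed, so restarts pick up exactly where the last run left off.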
