Data Ingestion with Databricks Data Intelligence Platform
Demo Type: Product Tutorial
Duration: Self-paced
What you’ll learn
With the Databricks Data Intelligence Platform, you can easily ingest data from any source into one unified Lakehouse. Whether batch, streaming, or change data capture (CDC), Databricks offers fast, reliable, and scalable tools to bring all your data together.
Explore data ingestion with Databricks, including Lakeflow Connect, the SQL read_files function, and Auto Loader. Key capabilities include:
Universal connectivity to apps, databases, and cloud storage
Automated and incremental ingestion with Auto Loader and Lakeflow Connect
Real-time readiness for instant insights and dashboards
Built-in governance through Unity Catalog
Simplified pipelines using Lakeflow Declarative Pipelines
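To make the incremental-ingestion capability concrete, here is a minimal sketch of Auto Loader and the SQL read_files function. It assumes it runs inside a Databricks notebook (where `spark` is predefined); the paths, formats, and table names are illustrative placeholders, not part of this demo.

```python
# Incremental ingestion with Auto Loader (the cloudFiles streaming source).
# Auto Loader tracks which files it has already processed, so reruns only
# pick up new files. Paths and names below are hypothetical examples.
(spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")                 # source file format
    .option("cloudFiles.schemaLocation", "/tmp/schemas") # where inferred schema is stored
    .load("/Volumes/main/default/raw_events")            # cloud storage / volume path
    .writeStream
    .option("checkpointLocation", "/tmp/checkpoints")    # exactly-once progress tracking
    .trigger(availableNow=True)                          # process pending files, then stop
    .toTable("main.default.bronze_events"))

# Ad-hoc batch ingestion with the SQL read_files function on the same path.
df = spark.sql(
    "SELECT * FROM read_files('/Volumes/main/default/raw_events', format => 'json')"
)
```

The `availableNow` trigger makes the same pipeline usable for both scheduled batch runs and continuous streaming: remove it to keep the stream running for real-time dashboards.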
To install the demo, get a free Databricks workspace and execute the following two commands in a Python notebook:
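The two install commands follow the standard dbdemos pattern; the demo name passed to `install()` below is an assumption, so check the catalog returned by `dbdemos.list_demos()` for the exact identifier.

```python
# Cell 1: install the dbdemos library into the notebook environment.
%pip install dbdemos

# Cell 2: install this demo's assets (notebooks, pipelines, dashboards).
import dbdemos
dbdemos.install('auto-loader')  # demo name is an assumption; see dbdemos.list_demos()
```

Running `install()` creates the demo notebooks in your workspace and starts any associated pipelines and clusters.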
Dbdemos is a Python library that installs complete Databricks demos in your workspace. It loads and starts notebooks, DLT pipelines, clusters, Databricks SQL dashboards, warehouse models, and more.
Dbdemos is distributed as a GitHub project; for more details, see the README.md file and documentation in that repository.
Dbdemos is provided as is. See the License and Notice for more information.
Databricks does not offer official support for dbdemos and the associated assets.
For any issue, please open a ticket and the demo team will respond on a best-effort basis.
Note: at the Data + AI Summit in June 2025, Databricks released Lakeflow, which unifies data engineering with Lakeflow Connect, Lakeflow Declarative Pipelines (previously known as DLT), and Lakeflow Jobs (previously known as Workflows).

