Tutorials
Discover the power of the Lakehouse. Install demos in your workspace to quickly access best practices for data ingestion, governance, security, data science, and data warehousing.


Tutorials quickstart
Install demos directly from your Databricks notebooks
Load the dbdemos package in a cell
List and install any demo (see the example below)
Explore all tutorials
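For example, the steps above boil down to a couple of notebook cells. This is a minimal sketch: the demo name 'delta-lake' is just one entry from the catalog, and dbdemos.list_demos() prints them all.

```python
# Cell 1 - install the dbdemos package from PyPI
%pip install dbdemos
```

```python
# Cell 2 - list the available demos and install one into your workspace
import dbdemos

dbdemos.list_demos()           # prints the full demo catalog with descriptions
dbdemos.install('delta-lake')  # example demo name; creates the demo notebooks (and any pipelines/dashboards) in your workspace
```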

Spark Streaming - Advanced
Deep dive into Spark Streaming with Delta to build web app user sessions from clicks, with custom aggregation state management.

Full Delta Live Tables Pipeline - Loan
Ingest loan data and implement a DLT pipeline with quarantine.

Orchestrate and Run Your dbt Jobs
Launch your dbt pipelines in production on a SQL warehouse. Leverage Databricks Workflows for orchestration and add a dbt task to your transformation pipeline.

CDC Pipeline with Delta Live Tables
Ingest a Change Data Capture (CDC) flow with APPLY CHANGES INTO and simplify your SCD Type 2 implementation.
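As a rough sketch of the pattern, here is what a CDC flow with SCD Type 2 looks like using the Python dlt API. Table and column names (customers, customers_cdc_clean, id, operation, operation_date) are illustrative assumptions, not the demo's exact code.

```python
import dlt
from pyspark.sql.functions import col, expr

# Target table that APPLY CHANGES will materialize and keep in sync.
dlt.create_streaming_table("customers")

dlt.apply_changes(
    target="customers",                             # materialized target table
    source="customers_cdc_clean",                   # cleaned CDC feed (insert/update/delete events)
    keys=["id"],                                    # primary key used to match rows
    sequence_by=col("operation_date"),              # ordering column to resolve out-of-order events
    apply_as_deletes=expr("operation = 'DELETE'"),  # rows flagged as deletes remove the record
    except_column_list=["operation", "operation_date"],
    stored_as_scd_type="2"                          # keep full history as Slowly Changing Dimension Type 2
)
```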

Unit Testing Delta Live Tables (DLT) for Production-Grade Pipelines
Deploy robust Delta Live Tables pipelines with unit tests leveraging expectations.

Delta Lake
Store your tables with Delta Lake and discover how Delta Lake can simplify your data pipelines.

Databricks Auto Loader (cloudFiles)
Incrementally ingest new files from your cloud storage folders.
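As a rough illustration of the pattern, run from a Databricks notebook where spark is predefined (the source path, schema/checkpoint locations, and table name below are placeholders):

```python
# Auto Loader incrementally discovers and ingests new files as they land in the folder.
df = (spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", "/Volumes/main/default/checkpoints/schema")
        .load("s3://my-bucket/landing/"))

# Append the new records to a Delta table, processing all available files and then stopping.
(df.writeStream
   .option("checkpointLocation", "/Volumes/main/default/checkpoints/bronze")
   .trigger(availableNow=True)
   .toTable("main.default.bronze_events"))
```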

CDC Pipeline with Delta
Process CDC data to build an entire pipeline and materialize your operational tables in your lakehouse.

Lakeflow Declarative Pipeline - Introduction
Ingest real-time bike data to optimize rentals and enable predictive maintenance.

MLOps - End-to-End Pipeline
Automate your model deployment end to end with MLflow webhooks and Repos!

Pandas API with Spark Backend (Koalas)
Let your data science team scale to terabytes of data while working with the pandas API, without having to learn and move to another framework.
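As a small illustration (the path and column names are placeholders), the same pandas-style syntax runs distributed on Spark:

```python
import pyspark.pandas as ps  # pandas API on Spark (formerly Koalas)

# Placeholder path and columns; familiar pandas syntax, executed by Spark across the cluster.
pdf = ps.read_parquet("/Volumes/main/default/data/transactions")
top_countries = (pdf.groupby("country")["amount"]
                    .sum()
                    .sort_values(ascending=False)
                    .head(10))
print(top_countries)
```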

Feature Store and Online Inference
Leverage the Databricks Feature Store with streaming features and an online store for real-time inference.

Image Classification - Deep Learning for Defect Detection
Deep learning using the Databricks Lakehouse: detect defects in PCBs with Hugging Face transformers and PyTorch Lightning.

Build High-Quality RAG Apps with Mosaic AI Agent Framework and Agent Evaluation, Model Serving, and Vector Search
Learn how to create and deploy a real-time Q&A chatbot using Databricks retrieval augmented generation (RAG) and serverless capabilities, leveraging the DBRX Instruct Foundation Model for smart responses.

Mosaic AI Model Training: Fine-Tune Your LLM on Databricks for Specialized Tasks and Knowledge
With Databricks, you can fine-tune and deploy specialized open source LLMs that outperform baseline models, reduce costs, enhance security, and are tailored to your business needs, excelling in tasks like Named-Entity Recognition (NER).

LLM-Tools-Functions
Install this demo to discover how to build your first compound AI system step-by-step.

Databricks AI/BI: Analytics in Capital Markets With Dashboards and Genie
Discover how Databricks AI/BI can efficiently load data and enable end users to effortlessly analyze metrics in capital markets using AI/BI Genie and AI/BI Dashboards.

Databricks AI/BI: Customer Support Review With Dashboards and Genie
Databricks AI/BI loads data efficiently and — with AI/BI Genie and AI/BI Dashboards — lets end users effortlessly analyze metrics in customer support review. Learn how.

Databricks AI/BI: Genomic Patient Data Analysis With Dashboards and Genie
See how Databricks AI/BI efficiently loads genomic patient data and allows end users working with it to effortlessly perform analysis using AI/BI Genie and AI/BI Dashboards.

Databricks AI/BI: Marketing Campaign Effectiveness With Dashboards and Genie
Databricks AI/BI loads marketing campaign data efficiently and lets end users effortlessly analyze campaign effectiveness with AI/BI Genie and AI/BI Dashboards. Discover how.

Databricks AI/BI: Sales Pipeline Overview With Dashboards and Genie
For sales pipelines, Databricks AI/BI loads data efficiently and allows end users to perform analysis effortlessly using AI/BI Genie and AI/BI Dashboards. Find out how.

Databricks AI/BI: Supply Chain Optimization With Dashboards and Genie
Learn how Databricks AI/BI can efficiently load data and enable end users to effortlessly analyze supply chain optimization using AI/BI Genie and AI/BI Dashboards.

AI Functions: Query LLM with Databricks SQL
Call Azure OpenAI models on your Lakehouse data using AI_GENERATE_TEXT().

Data Warehousing with Identity, PK/FK, Stored Proc and Loops
Unlock new SQL warehouse features, including support for schema definition with auto-increment columns, primary and foreign keys, stored procedures, loops, and more!

Monitor Your Data Quality With Lakehouse Monitoring
Learn to easily create a monitor on any table in Unity Catalog to gain insights on data trends and anomalies. This tutorial covers a retail use case, monitoring transaction data, and best practices for configuring a monitor.

Delta Sharing - Airlines
Share your data with external organizations using Delta Sharing.

Upgrade Table to Unity Catalog
Discover how to upgrade your hive_metastore tables to Unity Catalog and benefit from UC capabilities: security, ACLs, row-level access, lineage, and auditing.

Data Lineage with Unity Catalog
Discover data lineage with Unity Catalog: table to table and column to column.

Table ACL & Dynamic Views with Unity Catalog
Discover how to GRANT permissions on your tables with UC and implement more advanced controls such as row-level data masking based on each user.
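A minimal sketch of the pattern, run from a Databricks notebook where spark is predefined (the catalog, table, view, and group names are made up for illustration):

```python
# Grant read access on a table to a group.
spark.sql("GRANT SELECT ON TABLE main.retail.customers TO `data-analysts`")

# Dynamic view: mask a sensitive column unless the querying user belongs to the right group.
spark.sql("""
CREATE OR REPLACE VIEW main.retail.customers_secured AS
SELECT
  id,
  CASE WHEN is_account_group_member('pii-readers') THEN email ELSE '***MASKED***' END AS email,
  country
FROM main.retail.customers
""")
```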

System Tables: Billing Forecast, Usage Analytics, and Access Auditing with Databricks Unity Catalog
Track and analyze usage, billing, and access with UC system tables.

Access Data on External Location
Discover how you can secure files and tables in external locations (cloud storage such as S3, ADLS, or GCS) with a simple GRANT command.
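For instance (the location, group, and path names below are placeholders, and the external location is assumed to have already been created by an admin):

```python
# Grant read access on the files under an external location to a group.
spark.sql("GRANT READ FILES ON EXTERNAL LOCATION `landing_zone` TO `data-engineers`")

# Members of that group can now list and read files directly under the secured path.
display(dbutils.fs.ls("s3://my-bucket/landing-zone/"))
```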
dbdemos is distributed as a GitHub project
For more details, please open the GitHub README.md file and follow the documentation.
dbdemos is provided as-is. See the License for more information. Databricks does not offer official support for dbdemos and the associated assets.
For any problem, please open a GitHub issue; the demo team will take a look on a best-effort basis. See the Notice for more information.