
Automated Deployment with Declarative Automation Bundles

This course provides a comprehensive review of DevOps principles and their application to Databricks projects. It begins with an overview of core DevOps and DataOps principles, continuous integration (CI), continuous deployment (CD), and testing, and explores how these principles can be applied to data engineering pipelines.


The course then focuses on continuous deployment within the CI/CD process, examining tools like the Databricks REST API, SDK, and CLI for project deployment. You will learn about Declarative Automation Bundles (DABs) and how they fit into the CI/CD process, dive into their key components and folder structure, and see how they streamline deployment across various target environments in Databricks. You will also learn how to add variables to, modify, validate, deploy, and execute Declarative Automation Bundles for multiple environments with different configurations using the Databricks CLI.
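As a rough illustration of the kind of bundle configuration the course works with, a minimal databricks.yml might define a variable and two targets like this (the project name, job, host URLs, and paths below are hypothetical placeholders, not course materials):

```yaml
# databricks.yml — minimal bundle sketch with a variable and two targets
# (all names and hosts here are hypothetical placeholders)
bundle:
  name: my_etl_project

variables:
  catalog:
    description: Target catalog for pipeline tables
    default: dev_catalog

targets:
  dev:
    mode: development
    default: true
    workspace:
      host: https://<your-workspace>.cloud.databricks.com
  prod:
    mode: production
    workspace:
      host: https://<your-workspace>.cloud.databricks.com

resources:
  jobs:
    etl_job:
      name: etl-${bundle.target}   # substitution resolves per target
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./src/etl_notebook
```

With a file like this in place, `databricks bundle validate` checks the configuration, `databricks bundle deploy -t prod` deploys to the prod target, and a variable can be overridden per deployment with `--var="catalog=prod_catalog"`.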


Finally, the course introduces Visual Studio Code as an Integrated Development Environment (IDE) for building, testing, and deploying Declarative Automation Bundles locally, optimizing your development process. The course concludes with an introduction to automating deployment pipelines using GitHub Actions to enhance the CI/CD workflow with Declarative Automation Bundles.
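To sketch what that GitHub Actions integration can look like (the workflow name, branch, and secret names below are assumptions for illustration, not part of the course materials), a deployment workflow might install the Databricks CLI and run the bundle commands on push:

```yaml
# .github/workflows/deploy.yml — hypothetical CD workflow for a bundle
name: deploy-bundle
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    env:
      # Assumed repository secrets for workspace authentication
      DATABRICKS_HOST: ${{ secrets.DATABRICKS_HOST }}
      DATABRICKS_TOKEN: ${{ secrets.DATABRICKS_TOKEN }}
    steps:
      - uses: actions/checkout@v4
      # Official action that installs the Databricks CLI on the runner
      - uses: databricks/setup-cli@main
      - name: Validate bundle
        run: databricks bundle validate
      - name: Deploy to production target
        run: databricks bundle deploy -t prod
```

The same validate/deploy commands used locally run unchanged in CI, which is what makes bundles a natural fit for this kind of automation.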


By the end of this course, you will be equipped to automate Databricks project deployments with Declarative Automation Bundles, improving efficiency through DevOps practices.


Languages Available: English | 日本語 | Português BR | 한국어

Skill Level
Professional
Duration
4h
Prerequisites

The content of this course was developed for participants with the following skills, knowledge, and abilities:

• Strong knowledge of the Databricks platform, including experience with Databricks Workspaces, Apache Spark, Delta Lake, the Medallion Architecture, Unity Catalog, Delta Live Tables, and Workflows. In particular, knowledge of leveraging expectations with DLT.

• Experience in data ingestion and transformation, with proficiency in PySpark for data processing and DataFrame manipulation. Candidates should also have experience writing intermediate-level SQL queries for data analysis and transformation.

• Proficiency in Python programming, including the ability to design and implement functions and classes, and experience with creating, importing, and utilizing Python packages.

• Familiarity with DevOps practices, particularly continuous integration and continuous delivery/deployment (CI/CD) principles.

• A basic understanding of Git version control.

• Prerequisite course: DevOps Essentials for Data Engineering

Outline

DevOps and CI/CD Review

• DevOps Review

• Continuous Integration and Continuous Deployment/Delivery (CI/CD) Review

• Course Setup and Authentication


Deployment with Declarative Automation Bundles (DABs)

• Deploying Databricks Projects

• Introduction to Declarative Automation Bundles (DABs)

• Deploying a Simple DAB

• Deploy a Simple DAB

• Variable Substitutions in DABs

• Deploying a DAB to Multiple Environments

• Deploy a DAB to Multiple Environments

• DAB Project Templates Overview

• Use a Databricks Default DAB Template

• CI/CD Project Overview with DABs

• Continuous Integration and Continuous Deployment with DABs

• Adding ML to Engineering Workflows with DABs


Doing More with Databricks Asset Bundles

• Developing Locally with Visual Studio Code (VSCode)

• Using VSCode with Databricks

• CI/CD Best Practices for Data Engineering

• Next Steps: Automated Deployment with GitHub Actions

Upcoming Public Classes

Date | Time | Language | Price
May 22 | 11 AM - 03 PM (Asia/Singapore) | English | $750.00
May 22 | 09 AM - 01 PM (America/New_York) | English | $750.00
Jun 25 | 08 AM - 12 PM (Asia/Kolkata) | English | $750.00
Jun 25 | 01 PM - 05 PM (Europe/London) | English | $750.00
Jul 22 | 01 PM - 05 PM (Australia/Sydney) | English | $750.00
Jul 22 | 09 AM - 01 PM (America/New_York) | English | $750.00

Public Class Registration

If your company has purchased success credits or has a learning subscription, please fill out the Training Request form. Otherwise, you can register below.

Private Class Request

If your company is interested in private training, please submit a request.


Registration options

Databricks has a delivery method for wherever you are on your learning journey


Self-Paced

Custom-fit learning paths for data, analytics, and AI roles, delivered through on-demand videos



Instructor-Led

Public and private half-day to two-day courses taught by expert instructors



Blended Learning

Self-paced content plus weekly instructor-led sessions for every style of learner, optimizing course completion and knowledge retention. Go to the Subscriptions Catalog tab to purchase.



Skills@Scale

A comprehensive training offering for large-scale customers that includes learning elements for every style of learner. Inquire with your account executive for details.


Data Engineer

DevOps Essentials for Data Engineering

This course explores software engineering best practices and DevOps principles, specifically designed for data engineers working with Databricks. Participants will build a strong foundation in key topics such as code quality, version control, documentation, and testing. The course emphasizes DevOps, covering core components, benefits, and the role of continuous integration and delivery (CI/CD) in optimizing data engineering workflows.

You will learn how to apply modularity principles in PySpark to create reusable components and structure code efficiently. Hands-on experience includes designing and implementing unit tests for PySpark functions using the pytest framework, followed by integration testing for Databricks data pipelines with Spark Declarative Pipelines and Jobs to ensure reliability.

The course also covers essential Git operations within Databricks, including using Databricks Git Folders to integrate continuous integration practices. Finally, you will take a high-level look at various deployment methods for Databricks assets, such as the REST API, CLI, SDK, and Declarative Automation Bundles (DABs), giving you the techniques needed to deploy and manage your pipelines.

By the end of the course, you will be proficient in software engineering and DevOps best practices, enabling you to build scalable, maintainable, and efficient data engineering solutions.

Languages Available: English | 日本語 | Português BR | 한국어 | Español | Français

Paid | 4h | Lab | Instructor-led | Associate

Questions?

If you have any questions, please refer to our Frequently Asked Questions page.