
Generative AI Engineering with Databricks

This course is aimed at data scientists, machine learning engineers, and other data practitioners who want to build generative AI applications using the latest and most popular frameworks and Databricks capabilities. 


Note: Databricks Academy is transitioning to a notebook-based format for classroom sessions within the Databricks environment, discontinuing the use of slide decks for lectures in the first three modules. You can access the lecture notebooks in the Vocareum lab environment.


Below, we describe each of the four four-hour modules included in this course.

Building Retrieval Agents on Databricks: This course provides hands-on training for building retrieval agents on the Databricks Data Intelligence Platform. Participants will learn to parse unstructured documents into structured data, transform and chunk content for retrieval workflows, build vector search solutions for document retrieval, and develop production-ready agents using MLflow and Agent Bricks. The course covers the complete agent lifecycle from document processing through embedding generation, vector indexing, and agent deployment with governance capabilities.
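The "transform and chunk content" step above can be illustrated with a minimal, framework-agnostic sketch. This is not a Databricks or course API; the function name and parameters are hypothetical, and real pipelines would typically chunk by tokens or document structure rather than raw characters.

```python
# Illustrative only: a minimal fixed-size chunker with overlap, the kind of
# preprocessing step this module covers before embedding and indexing.
# The function name and parameters are hypothetical, not a Databricks API.

def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping character windows for retrieval."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap  # slide the window, keeping some overlap
    return chunks

doc = "x" * 1200
pieces = chunk_text(doc, chunk_size=500, overlap=50)
```

The overlap preserves context that would otherwise be cut at chunk boundaries, which tends to improve retrieval quality at the cost of some index size.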


Building Single-Agent Applications on Databricks: This course provides hands-on training for building single-agent applications on the Databricks Data Intelligence Platform. Students will learn to create AI agents that leverage Unity Catalog functions as tools, implement comprehensive tracing and monitoring with MLflow, and deploy agents using both traditional frameworks like LangChain and modern solutions like Agent Bricks. The course covers the complete agent lifecycle from initial tool creation and testing in AI Playground through production deployment with governance, evaluation, and continuous improvement capabilities.


Agent Evaluation on Databricks: This course teaches students how to systematically evaluate AI agents using MLflow's evaluation framework, addressing the unique challenges of non-deterministic AI systems that traditional software testing cannot handle. Students learn to implement various evaluation approaches including built-in judges for common criteria like correctness and safety, guideline judges for business-specific requirements, and custom judges for specialized needs. The course covers both offline evaluation using curated datasets and online production monitoring, with hands-on experience using MLflow's tracing capabilities to understand agent execution patterns and collect human feedback from different stakeholder types. Through practical demonstrations and labs, students develop skills in creating evaluation workflows that drive continuous quality improvements throughout the AI agent development lifecycle.
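The judge idea described above can be sketched outside of any framework as a function that scores an agent's output and returns a structured assessment. This is an assumption-laden illustration, not MLflow's API: the `Assessment` class and `guideline_judge` function are hypothetical names, and this deterministic check stands in for what would usually be an LLM-based judge.

```python
# Illustrative only: the shape of a custom "judge" as discussed in this module.
# Real evaluations would use MLflow's evaluation APIs; this standalone sketch
# just shows scoring an agent's output against a business guideline.

from dataclasses import dataclass

@dataclass
class Assessment:
    name: str        # which criterion was checked
    passed: bool     # did the response satisfy it
    rationale: str   # why, for human review

def guideline_judge(response: str, forbidden_terms: list[str]) -> Assessment:
    """A deterministic safety-style judge: fail if any forbidden term appears."""
    hits = [t for t in forbidden_terms if t.lower() in response.lower()]
    if hits:
        return Assessment("no_forbidden_terms", False, f"found: {hits}")
    return Assessment("no_forbidden_terms", True, "no forbidden terms found")

result = guideline_judge("Our refund policy is 30 days.", ["guarantee", "legal advice"])
```

Returning a rationale alongside the pass/fail verdict is what makes judge output useful for the human-feedback and continuous-improvement loops the course covers.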


Generative AI Application Deployment and Monitoring: Ready to learn how to deploy, operationalize, and monitor generative AI applications? This module will help you gain skills in the deployment of generative AI applications using tools like Model Serving. We’ll also cover how to operationalize generative AI applications following best practices and recommended architectures. Finally, we’ll discuss monitoring generative AI applications and their components using Lakehouse Monitoring.

Skill Level
Associate
Duration
16h
Prerequisites

• Ability to write production-quality Python code, including OOP, exception handling, decorators, type hints, and proper documentation.

• Experience writing advanced SQL SELECT queries, handling data types and NULL values, and creating reusable, well-documented SQL functions.

• Comfort navigating the Databricks workspace and notebooks, managing compute, using Catalog Explorer, and understanding Databricks-managed services.

• Understanding of LLM behavior, basic prompt engineering, RAG concepts, agent reasoning, and working with REST APIs and JSON payloads.

• Basic familiarity with MLflow, agent frameworks (e.g., LangChain), and recommended Databricks training such as AI Agents Fundamentals.

• Familiarity with natural language processing concepts

• Familiarity with prompt engineering and its best practices

• Familiarity with the Databricks Data Intelligence Platform

• Familiarity with RAG (preparing data, building a RAG architecture, and concepts like embeddings, vectors, and vector databases)

• Experience with building LLM applications using multi-stage reasoning LLM chains and agents

• Experience with Databricks Data Intelligence Platform tools for evaluation and governance. 

• Understanding of Unity Catalog concepts including catalogs and schemas

• Basic knowledge of MLflow

Outline

Building Retrieval Agents on Databricks

• Document Parsing and Chunking

• Vector Search for Retrieval

• Building and Logging Retrieval Agents

• Agent Bricks


Building Single-Agent Applications on Databricks

• Foundations of Agents

• Building Single Agents

• Reproducible Agents

• Production-Ready Agents with Agent Bricks


Agent Evaluation on Databricks

• AI Agent Evaluation Fundamentals

• Built-In and Guideline Judges

• Custom Judges and Human Feedback


Generative AI Application Deployment and Monitoring

• Model Deployment Fundamentals

• Batch Deployment

• Real-Time Deployment

• AI System Monitoring

• LLMOps Concepts

Upcoming Public Classes

Date | Time | Language | Price
May 26 - 29 | 01 PM - 05 PM (Europe/London) | English | $1500.00
Jun 09 - 12 | 11 AM - 03 PM (Asia/Singapore) | English | $1500.00
Jun 09 - 10 | 09 AM - 05 PM (Europe/London) | English | $1500.00
Jun 09 - 12 | 08 AM - 12 PM (America/Los_Angeles) | English | $1500.00
Jun 23 - 26 | 01 PM - 05 PM (Europe/London) | English | $1500.00
Jul 28 - 31 | 11 AM - 03 PM (Asia/Singapore) | English | $1500.00
Jul 28 - 31 | 01 PM - 05 PM (Europe/London) | English | $1500.00
Jul 28 - 31 | 08 AM - 12 PM (America/Los_Angeles) | English | $1500.00
Jul 30 - 31 | 09 AM - 05 PM (Europe/London) | English | $1500.00

Public Class Registration

If your company has purchased success credits or has a learning subscription, please fill out the Training Request form. Otherwise, you can register below.

Private Class Request

If your company is interested in private training, please submit a request.

See all our registration options

Registration options

Databricks has a delivery method for wherever you are on your learning journey


Self-Paced

Custom-fit learning paths for data, analytics, and AI roles and career paths through on-demand videos



Instructor-Led

Public and private courses taught by expert instructors across half-day to two-day courses



Blended Learning

Self-paced and weekly instructor-led sessions for every style of learner to optimize course completion and knowledge retention. Go to the Subscriptions Catalog tab to purchase.



Skills@Scale

Comprehensive training offering for large scale customers that includes learning elements for every style of learning. Inquire with your account executive for details


Machine Learning Practitioner

Advanced Machine Learning with Databricks

This course is aimed at data scientists and machine learning practitioners and consists of two four-hour modules.

Machine Learning at Scale

In this course, you will gain theoretical and practical knowledge of Apache Spark’s architecture and its application to machine learning workloads within Databricks. You will learn when to use Spark for data preparation, model training, and deployment, while also gaining hands-on experience with Spark ML and pandas APIs on Spark. This course will introduce you to advanced concepts like hyperparameter tuning and scaling Optuna with Spark. This course will use features and concepts introduced in the associate course such as MLflow and Unity Catalog for comprehensive model packaging and governance.

Advanced Machine Learning Operations

In this course, you will be provided with a comprehensive understanding of the machine learning lifecycle and MLOps, emphasizing best practices for data and model management, testing, and scalable architectures. It covers key MLOps components, including CI/CD, pipeline management, and environment separation, while showcasing Databricks’ tools for automation and infrastructure management, such as Databricks Asset Bundles (DABs), Workflows, and Mosaic AI Model Serving. You will learn about monitoring, custom metrics, drift detection, model rollout strategies, A/B testing, and the principles of reliable MLOps systems, providing a holistic view of implementing and managing ML projects in Databricks.

Paid
8h
Lab
instructor-led
Professional
Data Engineer

Advanced Data Engineering with Databricks

This course serves as an appropriate entry point to learn Advanced Data Engineering with Databricks. 

Note: Databricks Academy is transitioning to a notebook-based format for classroom sessions within the Databricks environment, discontinuing the use of slide decks for lectures in the first module. You can access the lecture notebooks in the Vocareum lab environment.

Below, we describe each of the four four-hour modules included in this course.

Advanced Techniques with Spark Declarative Pipelines

This course explores Databricks' Lakeflow Spark Declarative Pipelines (SDP) for building production-grade streaming pipelines. You will learn advanced design patterns, robust data quality enforcement, and cross-platform integration essential for real-world lakehouse engineering.

Throughout the course, you will dive into modern data ingestion and processing techniques, mastering tools like Liquid Clustering for layout optimization and the Multiplex Streaming pattern for mixed-schema events. By the end of the modules, you will know how to confidently handle schema evolution, automate Change Data Capture (CDC), and ensure data integrity.

Through lectures and hands-on demos, you will:

• Build multi-flow pipelines to ingest multi-source data into a unified Bronze table.

• Apply Liquid Clustering and Data Quality Expectations across Silver and Gold layers.

• Implement the Multiplex pattern with Iceberg UniForm for cross-platform data access.

• Automate SCD Type 2 history tracking using AUTO CDC INTO.

• Design zero-data-loss quarantine pipelines to audit and manage invalid records.
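The zero-data-loss quarantine pattern from the last bullet can be sketched in plain Python. A real pipeline would express this with Spark Declarative Pipelines expectations over DataFrames; this hypothetical `split_valid_invalid` helper only shows the routing logic: invalid records are diverted for auditing, never dropped.

```python
# Illustrative only: the quarantine routing idea behind zero-data-loss
# pipelines. Invalid records go to a quarantine table for audit and repair
# instead of being discarded.

def split_valid_invalid(records, is_valid):
    """Route records to a main table or a quarantine table; drop nothing."""
    valid, quarantined = [], []
    for rec in records:
        (valid if is_valid(rec) else quarantined).append(rec)
    return valid, quarantined

rows = [{"id": 1, "amount": 10}, {"id": 2, "amount": None}]
good, bad = split_valid_invalid(rows, lambda r: r["amount"] is not None)
```

Because every input record lands in exactly one of the two outputs, the pipeline can enforce strict quality on the main table while still accounting for 100% of the source data.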

Databricks Data Privacy

This course is intended for data engineers, as well as customers, partners, and employees who perform data engineering tasks with Databricks. It aims to provide the knowledge and skills needed to execute these activities effectively on the Databricks platform.

Databricks Performance Optimization

In this course, you’ll learn how to optimize workloads and physical layout with Spark and Delta Lake, and how to analyze the Spark UI to assess performance and debug applications. We’ll cover topics like streaming, liquid clustering, data skipping, caching, Photon, and more.

Automated Deployment with Declarative Automation Bundles

This course provides a comprehensive review of DevOps principles and their application to Databricks projects. It begins with an overview of core DevOps, DataOps, continuous integration (CI), continuous deployment (CD), and testing, and explores how these principles can be applied to data engineering pipelines.

The course then focuses on continuous deployment within the CI/CD process, examining tools like the Databricks REST API, SDK, and CLI for project deployment. You will learn about Declarative Automation Bundles (DABs) and how they fit into the CI/CD process. You’ll dive into their key components, folder structure, and how they streamline deployment across various target environments in Databricks. You will also learn how to add variables, modify, validate, deploy, and execute Declarative Automation Bundles for multiple environments with different configurations using the Databricks CLI.

Finally, the course introduces Visual Studio Code as an Integrated Development Environment (IDE) for building, testing, and deploying Declarative Automation Bundles locally, optimizing your development process. The course concludes with an introduction to automating deployment pipelines using GitHub Actions to enhance the CI/CD workflow with Declarative Automation Bundles.

By the end of this course, you will be equipped to automate Databricks project deployments with Declarative Automation Bundles, improving efficiency through DevOps practices.

Languages Available: English | 日本語 | Português BR | 한국어

Paid
16h
Lab
instructor-led
Professional

Questions?

If you have any questions, please refer to our Frequently Asked Questions page.