Delivery Specialization: CDW Migration Best Practices

This course provides guidance on best practices for executing a successful migration from cloud data warehouses (CDWs) such as Amazon Redshift, Snowflake, and Azure Synapse Analytics to Databricks. It covers the full migration process, from initial discovery through delivery.

Skill Level
Professional
Duration
10h
Prerequisites

Databricks Platform Proficiencies:

• Configure & optimize SQL Warehouses, Jobs & Workflows

• Unity Catalog security, lineage & Delta Sharing

• Lakeflow bundles & serverless compute workflows

Real-World Experience

• ≥ 6 months building Spark SQL / PySpark ETL pipelines

• Deploy & monitor pipelines with Auto Loader & Delta Live Tables

• Performance tuning via Spark UI & cluster selection

Broader Domain Knowledge

• Medallion architecture (Bronze → Gold) & Delta Lake optimizations

• Dimensional & 3NF modeling; governed data sharing

• Cost-aware design for multi-cloud federation

Outline

1. Course Introduction

2. Discovery and Assessment

  • Why Migrate to Databricks?

  • Conducting Discovery Workshops

  • Architecture Assessment

3. Architecture Design and Planning

  • Architecture Feature Mapping

4. Data Warehouse Migration

  • Pre-Migration Considerations

  • Schema Migration

  • Data Migration

  • Other Database Object Migration

  • Governance & Security Migration

  • Best Practices

5. Code and ETL Pipelines Migration

  • Orchestration Migration

  • Code Migration

6. BI and Analytics Tools Integration

  • Downstream Tools and Cutovers

  • BI & Analytics Integration Execution

7. Migration Validation

  • DevSecOps

  • Validation Phases and Methods

  • Lakebridge

  • Lakebridge: Installation

8. Course Summary & Next Steps

Registration options

Databricks has a delivery method for wherever you are on your learning journey

Self-Paced

Custom-fit learning paths for data, analytics, and AI roles and careers, delivered through on-demand videos

Register now

Instructor-Led

Public and private half-day to two-day courses taught by expert instructors

Register now

Blended Learning

Self-paced and weekly instructor-led sessions for every style of learner, designed to optimize course completion and knowledge retention. Go to the Subscriptions Catalog tab to purchase.

Purchase now

Skills@Scale

Comprehensive training offering for large-scale customers that includes learning elements for every learning style. Inquire with your account executive for details.

Upcoming Public Classes

Get Started with Lakebase

This Get Started course introduces Databricks Lakebase, a fully managed PostgreSQL service built into the Databricks Data Intelligence Platform that brings operational (OLTP) and analytical (OLAP) workloads closer together.

The course begins with a conceptual lecture that compares OLTP and OLAP systems, explaining their different performance characteristics, storage models, and typical use cases. You will also explore the challenges organizations face when maintaining separate transactional databases and analytical platforms, including data movement, latency, and architectural complexity.

You will then learn how Databricks Lakebase helps address these challenges by providing a PostgreSQL-compatible operational database that integrates directly with the Databricks Lakehouse, enabling operational applications and analytics to work together within a unified platform.
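
To make that integration concrete, here is a minimal sketch of what querying Lakebase data from a Databricks notebook might look like once a Lakehouse Federation connection and foreign catalog are in place. The catalog, schema, and table names below (lakebase_fc, public, orders) are placeholders for illustration, not objects the course provides.

    # Illustrative sketch only, not official course code.
    # Assumes a foreign catalog named "lakebase_fc" has already been created
    # over the Lakebase instance via Lakehouse Federation; "spark" is the
    # SparkSession that Databricks notebooks provide automatically.
    df = spark.sql(
        "SELECT order_id, item, quantity FROM lakebase_fc.public.orders LIMIT 10"
    )
    df.show()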

Through hands-on labs, you will:

• Create and explore a Lakebase project using autoscaling compute

• Navigate the Lakebase UI, including branching, monitoring, and configuration settings

• Create and query tables using the Lakebase SQL Editor

• Query Lakebase data from Databricks using Lakehouse Federation and foreign catalogs

• Perform Reverse ETL by synchronizing Delta tables to Lakebase

• Connect to Lakebase from Python and perform basic CRUD operations (a brief sketch follows this list)
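
For a flavor of that final lab, the sketch below shows basic CRUD against Lakebase from Python. Because Lakebase is PostgreSQL-compatible, any standard Postgres driver should work; this example assumes the psycopg2 driver and placeholder connection details (host, database, user, and credential) that you would copy from your instance's connection settings. It is illustrative only and not the course's official lab code.

    # Illustrative CRUD sketch against a Lakebase (PostgreSQL-compatible) instance.
    # All connection details are placeholders; psycopg2 is one possible driver choice.
    import psycopg2

    conn = psycopg2.connect(
        host="your-lakebase-host.example.com",  # placeholder endpoint from the Lakebase UI
        dbname="databricks_postgres",           # placeholder database name
        user="your-user@example.com",           # placeholder identity
        password="YOUR_TOKEN_OR_PASSWORD",      # placeholder credential
        sslmode="require",
    )

    with conn, conn.cursor() as cur:
        # Create: a small table and one row
        cur.execute(
            "CREATE TABLE IF NOT EXISTS orders ("
            "  order_id SERIAL PRIMARY KEY,"
            "  item TEXT NOT NULL,"
            "  quantity INTEGER NOT NULL)"
        )
        cur.execute(
            "INSERT INTO orders (item, quantity) VALUES (%s, %s) RETURNING order_id",
            ("widget", 3),
        )
        order_id = cur.fetchone()[0]

        # Read
        cur.execute("SELECT item, quantity FROM orders WHERE order_id = %s", (order_id,))
        print(cur.fetchone())

        # Update
        cur.execute("UPDATE orders SET quantity = %s WHERE order_id = %s", (5, order_id))

        # Delete
        cur.execute("DELETE FROM orders WHERE order_id = %s", (order_id,))

    conn.close()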

This is a Get Started course, so the focus is on understanding the core concepts and basic workflows for working with Lakebase. Building full production applications on top of Lakebase is outside the scope of this course.

Note: For SCORM lecture files, please ensure that you close the SCORM window after completing the content. Do not click the ‘Next Lesson’ button, as doing so may prevent the SCORM module from being marked as complete.

Paid & Subscription
3h
Lab
Onboarding

Questions?

If you have any questions, please refer to our Frequently Asked Questions page.