
Data Engineering with Databricks

Languages Available: English | 日本語 | Português BR | 한국어


Data professionals from all walks of life will benefit from this comprehensive introduction to the components of the Databricks Lakehouse Platform that directly support putting ETL pipelines into production. You will use SQL and Python to define and schedule pipelines that incrementally process new data from a variety of data sources to power analytic applications and dashboards in the Lakehouse. This course offers hands-on instruction in the Databricks Data Science & Engineering Workspace, Databricks SQL, Delta Live Tables, Databricks Repos, Databricks Task Orchestration, and Unity Catalog.

Skill Level
Associate
Duration
16h
Prerequisites
  • Basic knowledge of SQL query syntax, including writing queries using SELECT, WHERE, GROUP BY, ORDER BY, LIMIT, and JOIN
  • Basic knowledge of SQL DDL statements to create, alter, and drop databases and tables
  • Basic knowledge of SQL DML statements, including DELETE, INSERT, UPDATE, and MERGE
  • Experience with or knowledge of data engineering practices on cloud platforms, including cloud features such as virtual machines, object storage, identity management, and metastores
  • Basic familiarity with Python variables, functions, and control flow (preferred)
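The SQL prerequisites above can be sketched concretely. The following is a minimal, illustrative example run against an in-memory SQLite database as a lightweight stand-in for Spark SQL; the table and column names are hypothetical, chosen only to exercise the constructs the prerequisites list (DDL, DML, and a SELECT with JOIN, WHERE, GROUP BY, ORDER BY, and LIMIT):

```python
import sqlite3

# In-memory database as a lightweight stand-in for a Spark SQL session.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# DDL prerequisite: create tables (CREATE / ALTER / DROP).
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")

# DML prerequisite: insert rows (DELETE / INSERT / UPDATE / MERGE).
cur.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Ada"), (2, "Grace")])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 1, 30.0), (2, 1, 20.0), (3, 2, 45.0)])

# Query prerequisite: SELECT with JOIN, WHERE, GROUP BY, ORDER BY, LIMIT.
rows = cur.execute("""
    SELECT c.name, SUM(o.amount) AS total
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    WHERE o.amount > 10
    GROUP BY c.name
    ORDER BY total DESC
    LIMIT 5
""").fetchall()
print(rows)  # [('Ada', 50.0), ('Grace', 45.0)]
conn.close()
```

If each statement here reads comfortably, the SQL portion of the prerequisites is covered.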

Outline

Day 1

  • Introduction to Databricks Lakehouse Platform, Workspace, and Services
  • Delta Lake
  • Relational entities on Databricks
  • ETL with Spark SQL
  • Just enough Python for Spark SQL
  • Incremental data processing with Structured Streaming and Auto Loader


Day 2

  • Medallion architecture in the data lakehouse
  • Delta Live Tables
  • Task orchestration with Databricks Jobs
  • Databricks SQL
  • Managing permissions in the lakehouse
  • Productionizing dashboards and queries on Databricks SQL

Upcoming Public Classes

Date        | Time                               | Language | Price
May 13 - 16 | 01 PM - 05 PM (Australia/Sydney)   | English  | $1500.00
May 13 - 14 | 09 AM - 05 PM (Europe/Paris)       | English  | $1500.00
May 13 - 16 | 09 AM - 01 PM (America/New_York)   | English  | $1500.00
May 20 - 21 | 09 AM - 05 PM (Asia/Kolkata)       | English  | $1500.00
May 23 - 24 | 09 AM - 05 PM (America/Los_Angeles)| English  | $1500.00
May 27 - 28 | 09 AM - 05 PM (Europe/Paris)       | English  | $1500.00
Jun 03 - 06 | 01 PM - 05 PM (Asia/Kolkata)       | English  | $1500.00
Jun 04 - 05 | 09 AM - 05 PM (Europe/Paris)       | English  | $1500.00
Jun 10 - 11 | 09 AM - 05 PM (Europe/London)      | English  | $1500.00
Jun 10 - 13 | 09 AM - 01 PM (America/New_York)   | English  | $1500.00
Jun 17 - 20 | 01 PM - 05 PM (Australia/Sydney)   | English  | $1500.00
Jun 17 - 18 | 09 AM - 05 PM (Asia/Kolkata)       | English  | $1500.00
Jun 17 - 18 | 09 AM - 05 PM (Europe/Paris)       | English  | $1500.00
Jun 17 - 21 | 09 AM - 01 PM (America/New_York)   | English  | $1500.00
Jun 18 - 21 | 02 PM - 06 PM (America/New_York)   | English  | $1500.00
Jun 24 - 25 | 09 AM - 05 PM (America/New_York)   | English  | $1500.00
Jul 01 - 02 | 10 AM - 06 PM (Asia/Singapore)     | English  | $1500.00
Jul 01 - 02 | 09 AM - 05 PM (America/Chicago)    | English  | $1500.00
Jul 02 - 03 | 09 AM - 05 PM (Europe/Paris)       | English  | $1500.00
Jul 08 - 09 | 09 AM - 05 PM (Asia/Kolkata)       | English  | $1500.00
Jul 08 - 09 | 09 AM - 05 PM (Europe/London)      | English  | $1500.00
Jul 15 - 16 | 09 AM - 05 PM (Australia/Sydney)   | English  | $1500.00
Jul 15 - 16 | 09 AM - 05 PM (America/New_York)   | English  | $1500.00
Jul 22 - 23 | 10 AM - 06 PM (Asia/Singapore)     | English  | $1500.00
Jul 22 - 23 | 08 AM - 04 PM (America/Chicago)    | English  | $1500.00
Jul 24 - 25 | 09 AM - 05 PM (Europe/London)      | English  | $1500.00
Jul 29 - 30 | 09 AM - 05 PM (Asia/Kolkata)       | English  | $1500.00
Jul 29 - 30 | 09 AM - 05 PM (Europe/Paris)       | English  | $1500.00
Aug 05 - 06 | 09 AM - 05 PM (America/New_York)   | English  | $1500.00
Aug 12 - 13 | 10 AM - 06 PM (Asia/Singapore)     | English  | $1500.00
Aug 12 - 13 | 08 AM - 04 PM (America/Chicago)    | English  | $1500.00
Aug 19 - 20 | 09 AM - 05 PM (Asia/Kolkata)       | English  | $1500.00
Aug 26 - 27 | 09 AM - 05 PM (America/New_York)   | English  | $1500.00

Public Class Registration

If your company has purchased success credits or has a learning subscription, please fill out the Training Request form. Otherwise, you can register below.

Private Class Request

If your company is interested in private training, please submit a request.

See all our registration options

Registration options

Databricks has a delivery method for wherever you are on your learning journey.

Self-Paced

Custom-fit learning paths for data, analytics, and AI roles and career paths through on-demand videos

Register now

Instructor-Led

Public and private courses taught by expert instructors across half-day to two-day courses

Register now

Blended Learning

Self-paced and weekly instructor-led sessions for every style of learner, designed to optimize course completion and knowledge retention. Go to the Subscriptions Catalog tab to purchase.

Purchase now

Skills@Scale

Comprehensive training offering for large-scale customers that includes learning elements for every style of learning. Inquire with your account executive for details.

Upcoming Public Classes

Data Engineer

Data Pipelines with Delta Live Tables

In this course, you'll use Delta Live Tables with your choice of Spark SQL or Python to define and schedule pipelines that incrementally process new data from a variety of data sources into the Lakehouse.

Learning objectives

  • Describe how Delta Live Tables tracks data dependencies in data pipelines.
  • Configure and run data pipelines using the Delta Live Tables UI.
  • Use Python or Spark SQL to define data pipelines that ingest and process data through multiple tables in the lakehouse using Auto Loader and Delta Live Tables.
  • Use APPLY CHANGES INTO syntax to process Change Data Capture feeds.
  • Review event logs and data artifacts created by pipelines and troubleshoot DLT syntax.

Prerequisites

  • Beginner familiarity with cloud computing concepts (virtual machines, object storage, etc.)
  • Ability to perform basic code development tasks using the Databricks Data Engineering & Data Science workspace (create clusters, run code in notebooks, use basic notebook operations, import repos from Git, etc.)
  • Beginner programming experience with Delta Lake:
      - Use Delta Lake DDL to create tables, compact files, restore previous table versions, and perform garbage collection of tables in the Lakehouse.
      - Use CTAS to store data derived from a query in a Delta Lake table.
      - Use SQL to perform complete and incremental updates to existing tables.
  • Beginner programming experience with Python (syntax, conditions, loops, functions)
  • Beginner programming experience with Spark SQL or PySpark:
      - Extract data from a variety of file formats and data sources.
      - Apply a number of common transformations to clean data.
      - Reshape and manipulate complex data using advanced built-in functions.
  • Production experience working with data warehouses and data lakes.

Last course update: April 2023
Paid | 4h | Lab | Instructor-led | Associate
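Among the prerequisites above is using SQL to perform complete and incremental updates to existing tables. A minimal sketch of an incremental upsert follows, using SQLite's ON CONFLICT clause as a stand-in for Spark SQL's MERGE INTO on Delta tables; the table and column names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Existing target table with two rows.
cur.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
cur.executemany("INSERT INTO users VALUES (?, ?)",
                [(1, "old@example.com"), (2, "b@example.com")])

# Incremental batch: one changed row (id=1) and one new row (id=3).
batch = [(1, "new@example.com"), (3, "c@example.com")]

# SQLite upsert; in Spark SQL this would be
# MERGE INTO ... WHEN MATCHED THEN UPDATE ... WHEN NOT MATCHED THEN INSERT ...
cur.executemany("""
    INSERT INTO users VALUES (?, ?)
    ON CONFLICT(id) DO UPDATE SET email = excluded.email
""", batch)

result = cur.execute("SELECT id, email FROM users ORDER BY id").fetchall()
print(result)
# [(1, 'new@example.com'), (2, 'b@example.com'), (3, 'c@example.com')]
conn.close()
```

The same matched/not-matched pattern underlies the course's APPLY CHANGES INTO processing of Change Data Capture feeds, applied there at the pipeline level rather than per statement.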
Career Workshop

March 20

Careers at Databricks

We're on a mission to help data teams solve the world's toughest problems. Will you join us?
Advance my career now

Questions?

If you have any questions, please refer to our Frequently Asked Questions page.