
Get Started with Databricks Platform Administration

In this course, you will learn the basics of platform administration on the Databricks Data Intelligence Platform, with a comprehensive overview of Unity Catalog, a vital component for effective data governance in Databricks environments. Divided into five modules, the course begins with a detailed introduction to Databricks infrastructure and the Data Intelligence Platform, including an in-depth walkthrough of the Databricks Workspace. You will then explore data governance principles in Unity Catalog, covering its key concepts, architecture, and roles. The course goes on to cover managing Unity Catalog metastores and compute resources, including clusters and SQL warehouses. Finally, you will learn data access control: privileges, fine-grained access, and how to govern data objects. By the end, you will be equipped to administer Unity Catalog, implement effective data governance, optimize compute resources, and enforce robust data security strategies. With the purchase of a Databricks Labs subscription, the course also closes with a comprehensive lab exercise to practice what you’ve learned in a live Databricks Workspace environment.


Languages Available: English | 日本語 | Português BR | 한국어

Skill Level
Onboarding
Duration
3h
Prerequisites
This content was developed for participants with the following skills, knowledge, and abilities:

• Familiarity with the Databricks Data Intelligence Platform and basic workspace operations (create clusters, run code in notebooks, use basic notebook operations)

• Basic understanding of identity and access management concepts (users, groups, service principals, authentication, authorization)

• Understanding of Unity Catalog fundamentals including the hierarchical object model (metastore, catalogs, schemas, tables, volumes, models)

• Basic knowledge of data governance principles and access control concepts (permissions, entitlements, administrative responsibilities)

• Beginner familiarity with cloud computing concepts (virtual machines, object storage, identity management, cloud resources)

• Basic understanding of workspace administration concepts including user management and permission assignment

• Knowledge of account-level versus workspace-level administration and the relationship between them

• A basic understanding of cloud computing and SQL concepts, including networking, SQL queries, and database structures such as tables and views

• Familiarity with Python programming, the Jupyter notebook interface, and foundational PySpark operations



Registration options

Databricks has a delivery method for wherever you are on your learning journey


Self-Paced

Custom-fit learning paths for data, analytics, and AI roles and career paths through on-demand videos

Register now


Instructor-Led

Public and private courses taught by expert instructors across half-day to two-day courses

Register now


Blended Learning

Self-paced courses plus weekly instructor-led sessions for every style of learner, to optimize course completion and knowledge retention. Go to the Subscriptions Catalog tab to purchase

Purchase now


Skills@Scale

Comprehensive training offering for large-scale customers that includes learning elements for every learning style. Inquire with your account executive for details

Upcoming Public Classes

Get Started with Lakebase

This Get Started course introduces Databricks Lakebase, a fully managed PostgreSQL service built into the Databricks Data Intelligence Platform that brings operational (OLTP) and analytical (OLAP) workloads closer together.

The course begins with a conceptual lecture that compares OLTP and OLAP systems, explaining their different performance characteristics, storage models, and typical use cases. You will also explore the challenges organizations face when maintaining separate transactional databases and analytical platforms, including data movement, latency, and architectural complexity.
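The storage-model contrast drawn in that lecture can be made concrete with a toy sketch (all data below is invented for illustration): a row store keeps each record together, which suits OLTP point lookups, while a column store keeps each attribute together, which suits OLAP aggregates.

```python
# Toy illustration of the OLTP vs. OLAP storage-model difference.

orders = [  # row-oriented layout: each record stored together (typical OLTP)
    {"id": 1, "customer": "Ada", "amount": 120.0},
    {"id": 2, "customer": "Grace", "amount": 75.5},
    {"id": 3, "customer": "Ada", "amount": 30.0},
]

columns = {  # column-oriented layout of the same data (typical OLAP)
    "id": [1, 2, 3],
    "customer": ["Ada", "Grace", "Ada"],
    "amount": [120.0, 75.5, 30.0],
}

# OLTP-style point lookup: one row, all fields -- natural in the row layout.
order_2 = next(r for r in orders if r["id"] == 2)

# OLAP-style aggregate: one field, all rows -- natural in the column layout.
total = sum(columns["amount"])

print(order_2["customer"], total)  # Grace 225.5
```

Real systems add indexes, compression, and transactions on top, but this is the core trade-off the lecture describes: each layout makes one access pattern cheap and the other expensive.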

You will then learn how Databricks Lakebase helps address these challenges by providing a PostgreSQL-compatible operational database that integrates directly with the Databricks Lakehouse, enabling operational applications and analytics to work together within a unified platform.

Through hands-on labs, you will:

• Create and explore a Lakebase project using autoscaling compute

• Navigate the Lakebase UI, including branching, monitoring, and configuration settings

• Create and query tables using the Lakebase SQL Editor

• Query Lakebase data from Databricks using Lakehouse Federation and foreign catalogs

• Perform Reverse ETL by synchronizing Delta tables to Lakebase

• Connect to Lakebase from Python and perform basic CRUD operations
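The last lab step, basic CRUD from Python, follows the standard Python DB-API pattern. The sketch below uses an in-memory SQLite connection purely as a stand-in so it runs anywhere; against a real Lakebase instance you would obtain the connection from a PostgreSQL driver such as psycopg2 instead (Lakebase is PostgreSQL-compatible), and the table and column names here are made up for illustration:

```python
import sqlite3  # stand-in database for this sketch; Lakebase itself speaks PostgreSQL


def run_crud(conn):
    """Create, insert, update, read, and delete a row through a DB-API connection."""
    cur = conn.cursor()
    cur.execute(
        "CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT, status TEXT)"
    )
    cur.execute("INSERT INTO customers (id, name, status) VALUES (1, 'Ada', 'active')")  # Create
    cur.execute("UPDATE customers SET status = 'inactive' WHERE id = 1")                 # Update
    cur.execute("SELECT name, status FROM customers WHERE id = 1")                       # Read
    row = cur.fetchone()
    cur.execute("DELETE FROM customers WHERE id = 1")                                    # Delete
    conn.commit()
    return row


# Against a real Lakebase instance the connection would come from a PostgreSQL
# driver instead; host, database, user, and password below are placeholders:
#   import psycopg2
#   conn = psycopg2.connect(host="<lakebase-host>", dbname="<database>",
#                           user="<user>", password="<token>", sslmode="require")
conn = sqlite3.connect(":memory:")
print(run_crud(conn))  # ('Ada', 'inactive')
```

The `run_crud` helper takes any DB-API connection, which is why the same code exercises both the SQLite stand-in here and, with literal SQL this simple, a PostgreSQL-compatible endpoint.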

This is a Get Started course, so the focus is on understanding the core concepts and basic workflows for working with Lakebase. Building full production applications on top of Lakebase is outside the scope of this course.

Note: For SCORM lecture files, please ensure that you close the SCORM window after completing the content. Do not click the ‘Next Lesson’ button, as doing so may prevent the SCORM module from being marked as complete.

Paid & Subscription
3h
Lab
Onboarding

Questions?

If you have any questions, please refer to our Frequently Asked Questions page.