Lakehouse with Delta Lake Deep Dive
Overview
• Audience: Technical
• Duration: Half day
• Hands-on labs: No
Description: This course provides a brief overview of data architecture concepts, an introduction to the Lakehouse paradigm, and an in-depth look at Delta Lake features and functionality. You will learn how to apply software engineering principles with Databricks as we demonstrate building end-to-end OLAP data pipelines using Delta Lake for both batch and streaming data. The course also covers serving data to end users through aggregate tables and Databricks SQL Analytics. Throughout, the emphasis is on data engineering best practices with Databricks.
By the end of the course, you will be able to:
• Identify the core components of Delta Lake that make a Lakehouse possible.
• Define commonly used optimizations available in Delta Engine.
• Build end-to-end batch and streaming OLAP data pipelines using Delta Lake.
• Make data available for consumption by downstream stakeholders using specified design patterns.
• Document data at the table level to promote data discovery and cross-team communication.
• Apply Databricks’ recommended best practices in engineering a single-source-of-truth Delta architecture.
Prerequisites:
• Familiarity with data engineering concepts
• Basic knowledge of Delta Lake core features and use cases
Type: Training
Format: Virtual
Track: Training
Room: Virtual Room 2 - 3000
Price: $0
Duration: Half-Day