Hands-on Learning: AI-Powered Data Engineering with Lakeflow: Techniques for Modern Data Professionals (repeat)
Overview
| Experience | In Person |
| --- | --- |
| Type | Hands-on Learning |
| Track | Data Engineering and Streaming |
| Industry | Enterprise Technology, Health and Life Sciences, Financial Services |
| Technologies | Databricks Workflows, DLT, Lakeflow |
| Skill Level | Beginner |
| Duration | 90 min |
This introductory workshop is aimed at data engineers seeking hands-on experience and at data architects who want to deepen their knowledge. It is structured to build a solid understanding of the following data engineering and streaming concepts:
- Introduction to Lakeflow and the Data Intelligence Platform
- Getting started with DLT for declarative data pipelines in SQL using Streaming Tables and Materialized Views (a minimal SQL sketch follows this list)
- Mastering Databricks Workflows with advanced control flow and triggers
- Understanding serverless compute
- Data governance and lineage with Unity Catalog
- Generative AI for Data Engineers: Genie and Databricks Assistant
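As a hedged preview of the DLT topic referenced above, the following SQL sketches the two object types in a minimal declarative pipeline: a Streaming Table that incrementally ingests files and a Materialized View that maintains an aggregate over it. The table names, the `/Volumes/main/demo/landing/orders/` path, and the `order_date` and `amount` columns are illustrative assumptions, not part of the official lab material.

```sql
-- Minimal sketch of a declarative SQL pipeline (names, path, and columns are placeholders).

-- Streaming Table: incrementally ingests new JSON files as they land.
CREATE OR REFRESH STREAMING TABLE raw_orders
AS SELECT *
FROM STREAM read_files(
  '/Volumes/main/demo/landing/orders/',
  format => 'json'
);

-- Materialized View: a declaratively maintained aggregate over the ingested data.
-- Inside a classic DLT pipeline you may need to reference the table as LIVE.raw_orders.
CREATE OR REFRESH MATERIALIZED VIEW daily_order_totals
AS SELECT
  order_date,
  count(*)    AS order_count,
  sum(amount) AS total_amount
FROM raw_orders
GROUP BY order_date;
```

Because both objects are declared rather than scripted, the pipeline engine decides how to refresh them and handles the incremental processing.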
We believe you can only become an expert by working on real problems and gaining hands-on experience. In this workshop we therefore equip you with your own lab environment and guide you through practical exercises such as working with GitHub, ingesting data from various sources, creating batch and streaming data pipelines, and more.
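To give a flavour of ingesting from a source other than files, here is a second hedged sketch that feeds a Streaming Table from a message bus with the `read_kafka` table-valued function. The broker address, topic name, and column aliases are assumptions for illustration; a real setup would also supply authentication options.

```sql
-- Sketch: ingest a Kafka topic into a Streaming Table (broker and topic are placeholders).
CREATE OR REFRESH STREAMING TABLE raw_events
AS SELECT
  CAST(key   AS STRING) AS event_key,
  CAST(value AS STRING) AS event_payload,
  timestamp             AS event_time
FROM STREAM read_kafka(
  bootstrapServers => 'broker-1.example.com:9092',
  subscribe        => 'orders_topic',
  startingOffsets  => 'earliest'
);
```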
Session Speakers
Frank Munz
TMM Principal
Databricks