Maximizing Value From Your Data with Lakehouse AI
Overview
Ready to get started with lakehouses, especially around DS/ML? This session is for you.
The journey from training models to taking them to production can be quite challenging. Teams face a wide range of issues: siloed data, inconsistent tools, complex infrastructure, and a lack of visibility into model performance. Learn how the Databricks Lakehouse Platform provides a unified, data-centric ML environment that accelerates and simplifies the machine learning lifecycle, with a standardized set of tools, frameworks, and governance applied across your lakehouse data.
In this session, you’ll learn about lakehouses and how to:
- Ingest, prepare and process data on a platform designed to handle production-scale ML training, including LLMs
- Leverage data science notebooks and MLflow to manually train and track your ML experiments or let AutoML do the experimentation for you
- Use real-time serving with fast autoscaling to save cost and maintain SLAs
- Monitor your deployed models for drift and accuracy
- Manage and govern all data and ML assets with Unity Catalog
Type
- Breakout
Experience
- In Person, Virtual
Track
- DSML: Production ML / MLOps, Databricks Experience (DBX)
Difficulty
- Intermediate
Duration
- 40 min
Don't miss this year's event!
Register now