How To Use Databricks SQL for Analytics on Your Lakehouse
On Demand
Type
- Session
Format
- Hybrid
Track
- Data Lakes, Data Warehouses and Data Lakehouses
Difficulty
- Intermediate
Room
- Moscone South | Upper Mezzanine | 160
Duration
- 80 min
Overview
Most organizations run complex cloud data architectures that silo applications, users, and data. As a result, most analysis is performed on stale data, and there isn’t a single source of truth for analytics.
Join this interactive, follow-along deep-dive demo to learn how Databricks SQL allows you to operate a multicloud lakehouse architecture that delivers data warehouse performance at data lake economics — with up to 12x better price/performance than traditional cloud data warehouses. Data analysts and data scientists can then work with the freshest, most complete data and quickly derive new insights for accurate decision-making.
Here’s what we’ll cover:
• Managing data access and permissions, and monitoring in real time how data is used and accessed across your entire lakehouse infrastructure
• Configuring and managing compute resources for fast performance, low latency, and high user concurrency to your data lake
• Creating and working with queries, dashboards, scheduled query refreshes, alerts, and troubleshooting features
• Creating connections to third-party BI and database tools (Power BI, Tableau, DbVisualizer, etc.) so that you can query your lakehouse without making changes to your analytical and dashboarding workflows
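The access-management and query workflows above can be sketched in Databricks SQL syntax. This is a minimal illustration, not material from the session itself; the catalog, schema, table, and group names (`main`, `sales`, `orders`, `analysts`) are hypothetical placeholders:

```sql
-- Grant a group read access to a table governed by Unity Catalog
-- (all object and group names below are placeholders)
GRANT USE CATALOG ON CATALOG main TO `analysts`;
GRANT USE SCHEMA ON SCHEMA main.sales TO `analysts`;
GRANT SELECT ON TABLE main.sales.orders TO `analysts`;

-- The kind of query you might save, visualize in a dashboard,
-- put on a refresh schedule, or attach an alert to
SELECT region, SUM(revenue) AS total_revenue
FROM main.sales.orders
GROUP BY region
ORDER BY total_revenue DESC;
```

A saved query like this can back a dashboard tile, and an alert can be configured to fire when, for example, `total_revenue` crosses a threshold on refresh.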