Only pay for what you use
No up-front costs. Pay only for the compute resources you use, billed at per-second granularity, with simple pay-as-you-go pricing or committed-use discounts.
| Compute type | Standard: One platform for your data analytics and ML workloads | Premium: Data analytics and ML at scale across your business |
|---|---|---|
| Jobs Compute / Jobs Compute Photon: Run data engineering pipelines to build data lakes and manage data at scale. | $0.15 / DBU | $0.30 / DBU |
| Delta Live Tables / Delta Live Tables Photon: Easily build high-quality streaming or batch ETL pipelines using Python or SQL with the DLT edition that is best for your workload. Learn more | - | Starting at $0.30 / DBU |
| SQL Classic / SQL Pro / Serverless SQL (preview): Run SQL queries for BI reporting, analytics and visualization to get timely insights from data lakes. Available in both Classic and Serverless (managed) compute. Learn more | - | See our extended time SQL promotion |
| All-Purpose Compute / All-Purpose Compute Photon: Run interactive data science and machine learning workloads. Also good for data engineering, BI and data analytics. | $0.40 / DBU | $0.55 / DBU |

Compare compute options · Calculate price

| Feature | Standard | Premium |
|---|---|---|
| Workspace for production jobs, analytics, and ML | | |
| Managed Apache Spark™ | ✓ | ✓ |
| Optimized Delta Lake | ✓ | ✓ |
| Cluster Autopilot | ✓ | ✓ |
| Jobs Scheduling & Workflow | ✓ | ✓ |
| Databricks SQL Workspace | - | ✓ |
| Databricks SQL Optimization | - | ✓ |
| Notebooks & Collaboration | ✓ | ✓ |
| Connectors & Integration | ✓ | ✓ |
| Databricks Runtime for ML | ✓ | ✓ |
| Managed MLflow | ✓ | ✓ |
| Up to 50x faster than Apache Spark™ (Standard) / Autoscaling for optimized performance (Premium) | | |
| Optimized Runtime Engine | ✓ | ✓ |
| Autoscaling | ✓ | ✓ |
| Optimized Autoscaling | ✓ | ✓ |
| Databricks Workspace administration (Standard) / Audit logs & automated policy controls (Premium) | | |
| Administration Console | ✓ | ✓ |
| Clusters for running production jobs | ✓ | ✓ |
| Alerting and monitoring with retries | ✓ | ✓ |
| Unity Catalog (Cross-Workspace Data Governance) | - | ✓ |
| Unity Catalog (Automated Data Lineage) | - | ✓ |
| Managed Delta Sharing | - | ✓ |
| Audit Logs | - | ✓ |
| Cluster Policies | - | ✓ |
| Secure network architecture (Standard) / Extend your cloud-native security for company-wide adoption (Premium) | | |
| Single Sign-On (SSO) | ✓ | ✓ |
| VNET Injection | ✓ | ✓ |
| Secure Cluster Connectivity | ✓ | ✓ |
| Role-based Access Control | - | ✓ |
| Azure AD credential passthrough | - | ✓ |
| Token Management API | - | ✓ |
| Customer Managed Keys | - | ✓ |
| IP Access List | - | ✓ |
| HIPAA Compliance | - | ✓ |

Pay as you go with a 14-day free trial, or contact us for commitment-based discounting or custom requirements.
The pricing shown above is for informational purposes and applies to Azure Databricks services only; it does not include pricing for any other required Azure resources (e.g., compute instances).
A Databricks Unit (DBU) is a normalized unit of processing power on the Databricks Lakehouse Platform used for measurement and pricing purposes. The number of DBUs a workload consumes is driven by processing metrics, which may include the compute resources used and the amount of data processed. Please visit the Microsoft Azure Databricks pricing page for more details, including official pricing by instance type.
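To make the DBU arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. It uses the Premium Jobs Compute rate of $0.30 / DBU from the table above, and it assumes a hypothetical DBU emission rate of 1.0 DBU per node-hour and a hypothetical $0.50 / hour Azure VM price purely for illustration; actual DBU consumption and VM costs depend on the instance type and workload, and Azure VM charges are billed separately by Azure.

```python
# Back-of-the-envelope cost sketch (illustrative only; not an official calculator).
# Assumptions: $0.30 / DBU (Premium Jobs Compute rate from the table above),
# 1.0 DBU emitted per node-hour, and a hypothetical $0.50 / hour Azure VM price.

def estimate_hourly_cost(dbu_per_node_hour: float,
                         dbu_rate_usd: float,
                         vm_price_per_hour_usd: float,
                         nodes: int) -> float:
    """Estimated hourly cost = Databricks DBU charge + underlying Azure VM charge."""
    dbu_charge = dbu_per_node_hour * dbu_rate_usd * nodes
    vm_charge = vm_price_per_hour_usd * nodes
    return dbu_charge + vm_charge

# Example: a 4-node Jobs Compute cluster on the Premium tier.
total = estimate_hourly_cost(dbu_per_node_hour=1.0,
                             dbu_rate_usd=0.30,
                             vm_price_per_hour_usd=0.50,
                             nodes=4)
print(f"Estimated cost: ${total:.2f} per hour")  # 1.20 (DBU) + 2.00 (VM) = $3.20
```

For real figures, use the price calculator below and the Microsoft Azure Databricks pricing page for official per-instance pricing.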
Customer success offerings
Databricks provides a range of customer success plans and support to maximize your return on investment with realized impact.
Training: Building data and AI experts
Support: World-class production operations at scale
Professional services: Accelerating your business outcomes
Estimate your price
Use our comprehensive price calculator to estimate your Databricks pricing for different workloads and the supported instance types.
Ready to get started?

