AWS Pricing
Databricks is deeply integrated with AWS security and data services to manage all your AWS data on a simple, open lakehouse.

Only pay for what you use
No up-front costs. Only pay for the compute resources you use, at per-second granularity, with simple pay-as-you-go pricing or committed-use discounts.
Standard: One platform for your data analytics and ML workloads
Premium: Data analytics and ML at scale across your business
Enterprise: Data analytics and ML for your mission-critical workloads

Classic Compute

| Compute type | Standard | Premium | Enterprise |
|---|---|---|---|
| Jobs Light Compute: run data engineering pipelines to build data lakes | $0.07 / DBU | $0.10 / DBU | $0.13 / DBU |
| Jobs Compute / Jobs Compute Photon: run data engineering pipelines to build data lakes and manage data at scale | $0.10 / DBU | $0.15 / DBU | $0.20 / DBU |
| Delta Live Tables / Delta Live Tables Photon: easily build high-quality streaming or batch ETL pipelines using Python or SQL with the DLT edition that best fits your workload (learn more) | Starting at $0.20 / DBU | Starting at $0.20 / DBU | Starting at $0.20 / DBU |
| SQL Classic / SQL Pro / Serverless SQL (preview): run SQL queries for BI reporting, analytics and visualization to get timely insights from data lakes; available in both Classic and Serverless (managed) compute (learn more) | – | | |
| All-Purpose Compute / All-Purpose Compute Photon: run interactive data science and machine learning workloads; also good for data engineering, BI and data analytics | $0.40 / DBU | $0.55 / DBU | $0.65 / DBU |

Learn more about our extended time SQL promotion.

Add-On Products

| Product | Standard | Premium | Enterprise |
|---|---|---|---|
| Enhanced Security and Compliance: provides enhanced security and controls for your compliance needs | – | – | 15% of Product Spend |
Plan feature comparison

| Feature | Standard | Premium | Enterprise |
|---|---|---|---|
| Workspace | Workspace for production jobs, analytics, and ML | Workspace for production jobs, analytics, and ML | Workspace for production jobs, analytics, and ML |
| Managed Apache Spark™ | ✓ | ✓ | ✓ |
| Optimized Delta Lake | ✓ | ✓ | ✓ |
| Cluster Autopilot | ✓ | ✓ | ✓ |
| Jobs Scheduling & Workflow | ✓ | ✓ | ✓ |
| Databricks SQL Workspace | – | ✓ | ✓ |
| Databricks SQL Optimization | – | ✓ | ✓ |
| Notebooks & Collaboration | ✓ | ✓ | ✓ |
| Connectors & Integration | ✓ | ✓ | ✓ |
| Databricks Runtime for ML | ✓ | ✓ | ✓ |
| Managed MLflow | ✓ | ✓ | ✓ |
| Performance | Up to 50x faster than Apache Spark™ | Autoscaling for optimized performance | Optimized performance |
| Optimized Runtime Engine | ✓ | ✓ | ✓ |
| Optimized Autoscaling | – | ✓ | ✓ |
| Governance | Databricks Workspace administration | Audit logs & automated policy controls | Audit logs & automated policy controls |
| Administration Console | ✓ | ✓ | ✓ |
| Unity Catalog (Cross-Workspace Data Governance) | – | ✓ | ✓ |
| Unity Catalog (Automated Data Lineage) | – | ✓ | ✓ |
| Managed Delta Sharing | – | ✓ | ✓ |
| Audit Logs | – | ✓ | ✓ |
| Cluster Policies | – | ✓ | ✓ |
| Security | Secured cloud & network architecture with authentications like single sign-on | Extend your cloud-native security for company-wide adoption | Advanced compliance and security for mission-critical data |
| Single Sign-On (SSO) | ✓ | ✓ | ✓ |
| Role-based Access Control | – | ✓ | ✓ |
| Federated IAM | – | ✓ | ✓ |
| Customer Managed VPC | – | ✓ | ✓ |
| Secure Cluster Connectivity | – | ✓ | ✓ |
| Token Management API | – | ✓ | ✓ |
| Customer Managed Keys | – | – | ✓ |
| IP Access List | – | – | ✓ |
| Enhanced Security Monitoring¹ | – | – | ✓ |
| HIPAA Compliance Controls¹ | – | – | ✓ |
| PCI-DSS Compliance Controls¹ | – | – | ✓ |
| FedRAMP-Moderate Compliance Controls¹ | – | – | ✓ |
¹ Available as an add-on.
Pay as you go with a 14-day free trial, or contact us for committed-use discounts or custom requirements (e.g., dedicated deployments like Private Cloud).
The pricing is for the Databricks platform only. It does not include pricing for any required AWS resources (e.g., compute instances).
A Databricks Unit (DBU) is a unit of processing capability per hour, billed on per-second usage. View the types of supported instances.
Customer success offerings
Databricks provides a range of customer success plans and support to maximize your return on investment with realized impact.
Support: world-class production operations at scale
Estimate your price
Use our comprehensive price calculator to estimate your cost for different Databricks workloads and the types of supported instances.
AWS pricing FAQ
What is a DBU?
A Databricks Unit (DBU) is a normalized unit of processing power on the Databricks Lakehouse Platform, used for measurement and pricing purposes. The number of DBUs a workload consumes is driven by processing metrics, which may include the compute resources used and the amount of data processed. For example, 1 DBU is the equivalent of Databricks running on an i3.xlarge machine with the Databricks 8.1 standard runtime for an hour. See the full list of supported instances and details.
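
To make the arithmetic concrete, here is a minimal cost-estimation sketch in Python. It assumes the Classic Compute list prices from the table above and an instance type rated at 1 DBU per hour (as with i3.xlarge); the function name and structure are illustrative, not part of any Databricks API, and EC2 charges are billed separately by AWS.

```python
# Rough Databricks-cost estimate: DBU-hours consumed x per-DBU rate.
# Rates are the Jobs Compute list prices from the table above; the
# dbu_per_hour figure for an instance type comes from the supported-
# instances list (e.g., i3.xlarge ~= 1 DBU/hour). EC2 is billed separately.

JOBS_COMPUTE_RATES = {"standard": 0.10, "premium": 0.15, "enterprise": 0.20}

def estimate_jobs_cost(num_nodes: int, hours: float,
                       dbu_per_hour: float = 1.0,
                       plan: str = "premium") -> float:
    """Estimated Databricks charge (USD) for a Jobs Compute run."""
    dbu_hours = num_nodes * hours * dbu_per_hour
    return dbu_hours * JOBS_COMPUTE_RATES[plan]

# A 10-node i3.xlarge Jobs Compute cluster running 2 hours on Premium:
# 10 nodes x 2 h x 1 DBU/h = 20 DBUs -> 20 x $0.15 = $3.00 (plus EC2).
print(f"${estimate_jobs_cost(10, 2, 1.0, 'premium'):.2f}")  # $3.00
```
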
What is the difference between Jobs workloads and All-Purpose workloads?
Jobs workloads run on Jobs clusters: clusters that are both started and terminated by the same job. Only one job can run on a Jobs cluster, for isolation purposes. All-Purpose workloads run on All-Purpose clusters: any cluster not classified as a Jobs cluster. All-Purpose clusters can be used for purposes such as running commands within Databricks notebooks, connecting via JDBC/ODBC for BI workloads, and running MLflow experiments on Databricks. Multiple users can share an All-Purpose cluster for collaborative, interactive analysis.
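
The sketch below shows the same notebook task configured both ways, as Databricks Jobs API 2.1 request payloads (the job names, notebook path, and cluster ID are illustrative placeholders): declaring new_cluster yields a Jobs cluster that is created for the run, terminated when it ends, and billed at the Jobs Compute rate, while referencing existing_cluster_id runs the task on an All-Purpose cluster at the (higher) All-Purpose Compute rate.

```python
# Two ways to run the same notebook task, sketched as Jobs API 2.1 payloads.
# Notebook path and cluster ID are placeholders.

# 1) Jobs cluster: created for this run and terminated afterward,
#    billed at the Jobs Compute rate.
job_on_jobs_cluster = {
    "name": "nightly-etl",
    "tasks": [{
        "task_key": "etl",
        "notebook_task": {"notebook_path": "/Repos/etl/main"},
        "new_cluster": {
            "spark_version": "11.3.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 8,
        },
    }],
}

# 2) All-Purpose cluster: long-lived, shareable, referenced by ID,
#    billed at the All-Purpose Compute rate.
job_on_all_purpose_cluster = {
    "name": "ad-hoc-analysis",
    "tasks": [{
        "task_key": "analyze",
        "notebook_task": {"notebook_path": "/Repos/etl/main"},
        "existing_cluster_id": "0123-456789-abcdefgh",
    }],
}
```
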
Who pays for the underlying compute resources?
For Classic compute, Databricks deploys cluster resources into your AWS VPC, and you are responsible for the EC2 charges. For Serverless compute, Databricks deploys the cluster resources into a VPC in Databricks' AWS account, and you are not required to pay separately for EC2 charges. Please see here for more details.
Will I be charged for network egress?
If your source data is in a different AWS region than the Databricks Serverless environment, AWS may charge you network egress fees. Databricks is currently waiving charges for egress from the Serverless environment to your destination region, but we may charge for such egress at market-competitive rates in the future.
What is Jobs Light Compute?
A Jobs Light cluster is Databricks' equivalent of open-source Apache Spark. It targets simple, non-critical workloads that don't need the performance, reliability or autoscaling benefits provided by Databricks' proprietary technologies. In comparison, a Jobs cluster provides all of those benefits, boosting your team's productivity and reducing your total cost of ownership.
What does the free trial include?
The 14-day free trial gives you access to either the Standard or the Premium feature set, depending on the plan you choose. Contact us if you are interested in the Databricks Enterprise or Dedicated plan for custom deployment and other enterprise customizations. Note that you will still be charged by your cloud provider for resources (e.g., compute instances) used within your account during the free trial.
What happens when the trial ends?
At the end of the trial, you are automatically subscribed to the plan you used during the trial. You can cancel your subscription at any time.
What is Databricks Community Edition?
Databricks Community Edition is a free, limited-functionality platform designed for anyone who wants to learn Spark. Sign up here.
How will I be billed?
By default, you will be billed monthly to your credit card, based on per-second usage. Contact us for other billing options, such as billing by invoice or an annual plan.
Do you provide technical support?
We offer technical support with our annual commitments. Self-serve customers are also encouraged to check the technical documentation. Contact us to learn more.
How do I deploy Databricks in a HIPAA-compliant way?
You must contact us for a HIPAA-compliant deployment. Before processing any PHI data in Databricks, a signed business associate agreement (BAA) must be in place between your organization and (a) Databricks, Inc. and (b) Amazon Web Services, because you must have your own AWS account to deploy Databricks on AWS. Please see here for more details.
How do I access preview features?
Please contact us to get access to preview features.
How is Product Spend calculated for the Enhanced Security and Compliance add-on?
Product Spend is calculated based on your Databricks on AWS product spend at list prices, before the application of any discounts, usage credits, add-on uplifts or support fees.
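
As an illustration with made-up numbers: a workspace with $10,000 of monthly product spend at list prices would incur a $1,500 add-on charge, regardless of any negotiated discounts on the underlying spend.

```python
# Illustrative only: the add-on is billed as 15% of product spend at list,
# i.e., before any discounts, usage credits, add-on uplifts or support fees.
list_product_spend = 10_000.00  # hypothetical monthly spend at list prices (USD)
addon_charge = 0.15 * list_product_spend
print(f"Enhanced Security and Compliance add-on: ${addon_charge:,.2f}")
# -> Enhanced Security and Compliance add-on: $1,500.00
```
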
Ready to get started?

