
Only pay for what you use
There are no upfront costs. With simple pay-as-you-go pricing or committed-use discounts, you pay only for the compute resources you use, billed per second.
**Standard**: One platform for your data analytics and ML workloads
**Premium**: Data analytics and ML at scale across your business
**Enterprise**: Data analytics and ML for your mission-critical workloads

| Classic Compute | Standard | Premium | Enterprise |
|---|---|---|---|
| **Jobs Light Compute** Run data engineering pipelines to build data lakes. | $0.07 / DBU | $0.10 / DBU | $0.13 / DBU |
| **Jobs Compute / Jobs Compute Photon** Run data engineering pipelines to build data lakes and manage data at scale. | $0.10 / DBU | $0.15 / DBU | $0.20 / DBU |
| **Delta Live Tables / Delta Live Tables Photon** Easily build high-quality streaming or batch ETL pipelines using Python or SQL with the DLT edition that best fits your workload. Learn more | $0.20 - $0.36 / DBU | $0.20 - $0.36 / DBU | $0.20 - $0.36 / DBU |
| **SQL Compute** Run SQL queries for BI reporting, analytics, and visualization to get timely insights from data lakes. | - | $0.22 / DBU | $0.22 / DBU |
| **All-Purpose Compute / All-Purpose Compute Photon** Run interactive data science and machine learning workloads. Also good for data engineering, BI, and data analytics. | $0.40 / DBU | $0.55 / DBU | $0.65 / DBU |

| Serverless Compute | Standard | Premium | Enterprise |
|---|---|---|---|
| **Serverless SQL Compute (Preview)** Serverless SQL provides instant, managed compute hosted in the Databricks cloud provider account. | - | $0.70 / DBU (compute infrastructure included; network egress charges may apply) | $0.70 / DBU (compute infrastructure included; network egress charges may apply) |

| Add-on Products | Standard | Premium | Enterprise |
|---|---|---|---|
| **Enhanced Security and Compliance** Provides enhanced security and controls for your compliance needs | - | - | 15% of Product Spend (free during Preview) |

| Feature | Standard | Premium | Enterprise |
|---|---|---|---|
| **Workspace for production jobs, analytics, and ML** | | | |
| Managed Apache Spark™ | ✓ | ✓ | ✓ |
| Optimized Delta Lake | ✓ | ✓ | ✓ |
| Cluster Autopilot | ✓ | ✓ | ✓ |
| Jobs Scheduling & Workflow | ✓ | ✓ | ✓ |
| Databricks SQL Workspace | | ✓ | ✓ |
| Databricks SQL Optimization | | ✓ | ✓ |
| Notebooks & Collaboration | ✓ | ✓ | ✓ |
| Connectors & Integration | ✓ | ✓ | ✓ |
| Databricks Runtime for ML | ✓ | ✓ | ✓ |
| Managed MLflow | ✓ | ✓ | ✓ |
| **Performance**: up to 50x faster than Apache Spark™ (Standard); autoscaling for optimized performance (Premium); optimized performance (Enterprise) | | | |
| Optimized Runtime Engine | ✓ | ✓ | ✓ |
| Optimized Autoscaling | | ✓ | ✓ |
| **Administration**: Databricks Workspace administration (Standard); audit logs & automated policy controls (Premium, Enterprise) | | | |
| Administration Console | ✓ | ✓ | ✓ |
| Audit Logs | | ✓ | ✓ |
| Cluster Policies | | ✓ | ✓ |
| **Security**: secured cloud & network architecture with authentications like single sign-on (Standard); extend your cloud-native security for company-wide adoption (Premium); advanced compliance and security for mission-critical data (Enterprise) | | | |
| Single Sign-On (SSO) | ✓ | ✓ | ✓ |
| Role-based Access Control | | ✓ | ✓ |
| Federated IAM | | ✓ | ✓ |
| Customer Managed VPC | | ✓ | ✓ |
| Secure Cluster Connectivity | | ✓ | ✓ |
| Token Management API | | ✓ | ✓ |
| Customer Managed Keys | | | ✓ |
| IP Access List | | | ✓ |
| Enhanced Security Monitoring¹ | | | ✓ |
| HIPAA Compliance Controls¹ | | | ✓ |
| PCI-DSS Compliance Controls¹ | | | ✓ |
| FedRAMP-Moderate Compliance Controls¹ | | | ✓ |

¹ Available as an add-on.
Pay as you go with a 14-day free trial or contact us for committed-use discounts or custom requirements (e.g., dedicated deployments like Private Cloud).
The pricing is for the Databricks platform only. It does not include pricing for any required AWS resources (e.g., compute instances).
A Databricks Unit (DBU) is a unit of processing capability per hour, billed on per-second usage. See the list of supported instance types.
Customer success offerings
Databricks provides a range of customer success plans and support to maximize your return on investment.
Support
World-class support for large-scale production operations
AWS pricing FAQ
A Databricks Unit (DBU) is a normalized unit of processing power on the Databricks Lakehouse Platform used for measurement and pricing purposes. The number of DBUs a workload consumes is driven by processing metrics which may include the compute resources used and the amount of data processed. For example, 1 DBU is the equivalent of Databricks running on an i3.xlarge machine with the Databricks 8.1 standard runtime for an hour. See the full list of supported instances and details.
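As a rough sketch of how DBU billing composes with separately billed EC2 charges, consider the following. The DBU rate and the 1-DBU-per-hour i3.xlarge equivalence come from this page; the cluster size, runtime, and EC2 on-demand price are illustrative assumptions only.

```python
# Rough cost sketch for a Jobs Compute workload on the Standard plan.
# DBU rate ($0.10) and the ~1 DBU/hour for i3.xlarge come from this page;
# node count, hours, and the EC2 hourly price are assumed for illustration.

DBU_RATE = 0.10            # $/DBU, Jobs Compute on the Standard plan
DBUS_PER_NODE_HOUR = 1.0   # i3.xlarge ~ 1 DBU per hour (see FAQ)
EC2_PRICE = 0.312          # $/hour for i3.xlarge (assumed on-demand rate)

nodes = 4
hours = 10

dbu_cost = DBU_RATE * DBUS_PER_NODE_HOUR * nodes * hours
ec2_cost = EC2_PRICE * nodes * hours   # billed separately by AWS
total = dbu_cost + ec2_cost

print(f"DBU cost:   ${dbu_cost:.2f}")   # $4.00
print(f"EC2 cost:   ${ec2_cost:.2f}")   # $12.48
print(f"Total cost: ${total:.2f}")      # $16.48
```

With Serverless compute, the EC2 line item disappears, since the cluster runs in Databricks' AWS account.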
Jobs workloads are workloads running on Jobs clusters. Jobs clusters are clusters that are both started and terminated by the same Job. Only one job can be run on a Jobs cluster for isolation purposes. All-Purpose workloads are workloads running on All-Purpose clusters. All-Purpose clusters are clusters that are not classified as Jobs clusters. They can be used for various purposes such as running commands within Databricks notebooks, connecting via JDBC/ODBC for BI workloads, running MLflow experiments on Databricks. Multiple users can share an All-Purpose cluster for doing interactive analysis in a collaborative way.
For Classic compute, Databricks deploys cluster resources into your AWS VPC and you are responsible for paying for EC2 charges. For Serverless compute, Databricks deploys the cluster resources into a VPC in Databricks’ AWS account and you are not required to separately pay for EC2 charges. Please see here for more details.
If your source data is in a different AWS cloud region than the Databricks Serverless environment, AWS may charge you network egress charges. Databricks is currently waiving charges for egress from the Serverless environment to your destination region, but we may charge for such egress at market-competitive rates in the future.
Jobs Light cluster is Databricks’ equivalent of open-source Apache Spark. It targets simple, non-critical workloads that don’t need the performance, reliability, or autoscaling benefits provided by Databricks’ proprietary technologies. In comparison, the Jobs cluster provides you with all of the aforementioned benefits to boost your team productivity and reduce your total cost of ownership.
The 14-day free trial gives you access to either Standard or Premium feature sets depending on your choice of the plan. Contact us if you are interested in Databricks Enterprise or Dedicated plan for custom deployment and other enterprise customizations. Please note that you will still be charged by your cloud provider for resources (e.g. compute instances) used within your account during the free trial.
When your trial ends, you are automatically subscribed to the plan you used during the free trial. You can cancel your subscription at any time.
Databricks Community Edition is a free, limited functionality platform designed for anyone who wants to learn Spark. Sign up here.
By default, you will be billed monthly based on per-second usage on your credit card. Contact us for more billing options, such as billing by invoice or an annual plan.
We offer technical support with our annual commitments. Customers on self-serve options are also encouraged to check the technical documentation. Contact us to learn more.
You must contact us for a HIPAA-compliant deployment. Please note that prior to processing any PHI data in Databricks, a signed business associate agreement (BAA) must be in place between your organization and (a) Databricks, Inc.; and (b) because you must have your own account with AWS to deploy Databricks on AWS, Amazon Web Services. Please see here for more details.
Please contact us to get access to preview features.
Yes. We provide our customers with the ability to decide for themselves whether the tradeoffs for additional encryption are necessary given the workloads being processed. Please contact us to enable it.
Product Spend is calculated based on AWS product spend at list, before the application of any discounts, usage credits, add-on uplifts, or support fees.
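To illustrate how the "at list, before discounts" rule affects an add-on priced as a percentage of Product Spend, here is a small worked example. The 15% uplift is from the pricing table above; the spend and discount figures are hypothetical.

```python
# Hypothetical example: the Enhanced Security and Compliance add-on is
# 15% of Product Spend, where Product Spend is computed on list-price
# spend BEFORE any discounts or usage credits are applied.

list_price_spend = 10_000.00   # monthly platform spend at list (assumed)
discount_rate = 0.20           # hypothetical committed-use discount

addon = 0.15 * list_price_spend                        # uplift on list price
net_platform = list_price_spend * (1 - discount_rate)  # discounted platform cost
total = net_platform + addon

print(f"Add-on:  ${addon:,.2f}")        # $1,500.00
print(f"Platform: ${net_platform:,.2f}")  # $8,000.00
print(f"Total:   ${total:,.2f}")        # $9,500.00
```

Note that the add-on is computed on the $10,000 list figure, not the $8,000 discounted figure.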
Ready to get started?

