Databricks on Google Cloud

Databricks is tightly integrated with Google Cloud's security and data services so you can manage all of your Google Cloud data in one simple, open lakehouse.


Try for free | Learn more

Only pay for what you use

No upfront costs. You pay only for the server resources you actually use, billed per second, with simple pay-as-you-go pricing or discounts for committed use.

Standard
One platform for your data analytics and ML workloads

Premium
Data analytics and ML at scale and for mission-critical enterprise workloads

Jobs Compute / Jobs Compute Photon
Run data engineering pipelines to build data lakes and manage data at scale.
Standard: 0.15 / DBU · Premium: 0.22 / DBU

SQL Compute (Preview)
Run SQL queries for BI reporting, analytics and visualization to get timely insights from data lakes with your favorite SQL and BI tools.
Standard: not available · Premium: 0.22 / DBU

DLT Advanced Compute / DLT Advanced Compute Photon (Preview)
Easily build high-quality streaming or batch ETL pipelines using Python or SQL, perform CDC, and trust your data with quality expectations and monitoring. Learn more (a minimal Python sketch follows the feature comparison below).
Standard: 0.40 / DBU · Premium: 0.40 / DBU

All-Purpose Compute / All-Purpose Compute Photon
Run interactive data science and machine learning workloads. Also good for data engineering, BI and data analytics.
Standard: 0.40 / DBU · Premium: 0.55 / DBU

Add-on Products

GCP HIPAA Compliance
Provides enhanced security and controls for your HIPAA compliance needs.
Standard: not available · Premium: 10% of Product Spend

Databricks Workspace

Workspace for production jobs, analytics, and ML (Standard and Premium)
Managed Apache Spark™
Optimized Delta Lake
Cluster Autopilot
Jobs Scheduling & Workflow
Databricks SQL Workspace (Preview) (Premium only)
Databricks SQL Optimization (Preview) (Premium only)
Notebooks & Collaboration
Databricks Runtime for Machine Learning
Connectors & Integration
Managed MLflow

Performance

Up to 50x faster than Apache Spark™ (Standard and Premium)
Optimized Runtime Engine
Optimized Autoscaling

Governance and Manageability

Databricks Workspace administration (Standard and Premium)
Administration Console
Audit Logs
Cluster Policies

Enterprise Security

Standard: Single sign-on
Premium: Extend your cloud-native security for company-wide adoption
Single Sign-On (SSO)
Role-Based Access Control (Premium only)
Token Management API (Premium only)
Secure Cluster Connectivity
IP Access List (Premium only)
HIPAA Compliance Controls¹ (Premium only)
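As a concrete illustration of the DLT Advanced Compute row above, here is a minimal Delta Live Tables sketch in Python with a quality expectation. It is illustrative only: the table names, source bucket path, JSON format, and column names are assumptions, and the code runs only inside a Delta Live Tables pipeline on Databricks, where the `spark` session is provided.

```python
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Raw events ingested incrementally from cloud storage")
def raw_events():
    # Auto Loader picks up new files as they arrive (streaming ingestion).
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("gs://<your-bucket>/events/")  # placeholder GCS path
    )

@dlt.table(comment="Cleaned events for downstream analytics")
@dlt.expect_or_drop("valid_user", "user_id IS NOT NULL")  # rows failing this expectation are dropped
def clean_events():
    return (
        dlt.read_stream("raw_events")
        .select("user_id", "event_type", col("ts").cast("timestamp").alias("event_time"))
    )
```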


¹ Available as an add-on.

Pay as you go with a 14-day free trial or contact us
for committed-use discounts.

Prices apply to the Databricks platform only. Prices for the required GCP resources (e.g., server instances) are not included.

A Databricks Unit (DBU) is a unit of processing capability per hour, billed per second based on actual usage. View supported instance types.
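For a rough sense of how per-second, DBU-based billing translates into platform cost, here is a minimal worked example in Python. The 0.22 / DBU rate comes from the Premium Jobs Compute row above; the 8-DBU-per-hour consumption figure and 45-minute runtime are assumptions for illustration, and GCP instance costs are billed separately.

```python
def databricks_platform_cost(dbu_rate, dbus_per_hour, runtime_seconds):
    """Prorate hourly DBU consumption to the second, then price it at the plan's DBU rate."""
    hours = runtime_seconds / 3600
    return dbu_rate * dbus_per_hour * hours

# Premium Jobs Compute at 0.22 / DBU, a cluster consuming 8 DBUs per hour, running for 45 minutes:
print(databricks_platform_cost(0.22, 8, 45 * 60))  # 0.22 * 8 * 0.75 = 1.32
```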

Customer success offerings

Databricks provides a range of customer success plans and support to maximize your return on investment with realized impact.

Training

Train data and AI experts

Support

Outstanding production operations at scale

Professional Services

Accelerating your business outcomes

Estimate your price

Use our comprehensive price calculator to estimate your cost for
different Databricks workloads and the types of supported instances.

Calculate price →

GCP pricing FAQ

A Databricks Unit (DBU) is a normalized unit of processing power on the Databricks Lakehouse Platform used for measurement and pricing purposes. The number of DBUs a workload consumes is driven by processing metrics, which may include the compute resources used and the amount of data processed.

Jobs workloads are workloads running on Jobs clusters. Jobs clusters are clusters that are both started and terminated by the same job; for isolation purposes, only one job can run on a Jobs cluster. All-Purpose workloads are workloads running on All-Purpose clusters, which are clusters that are not classified as Jobs clusters. They can be used for various purposes, such as running commands within Databricks notebooks, connecting via JDBC/ODBC for BI workloads, or running MLflow experiments on Databricks. Multiple users can share an All-Purpose cluster for collaborative interactive analysis.
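For illustration, the sketch below shows one way the two cluster types are typically created via the Databricks REST API: an All-Purpose cluster through the Clusters API (clusters/create), and a job that brings its own Jobs cluster through the Jobs API (jobs/create). It is a minimal sketch, not a definitive recipe: the workspace URL, access token, notebook path, runtime version, and GCP node type are placeholders, and payload fields should be checked against the current API reference.

```python
import requests

HOST = "https://<your-workspace>.gcp.databricks.com"           # placeholder workspace URL
HEADERS = {"Authorization": "Bearer <personal-access-token>"}  # placeholder token

# All-Purpose cluster: created directly and shared for interactive work
# (notebook commands, JDBC/ODBC BI connections, MLflow experiments).
requests.post(
    f"{HOST}/api/2.0/clusters/create",
    headers=HEADERS,
    json={
        "cluster_name": "interactive-analysis",
        "spark_version": "<runtime-version>",  # placeholder Databricks runtime
        "node_type_id": "<gcp-node-type>",     # placeholder supported GCP instance type
        "num_workers": 2,
    },
)

# Jobs cluster: declared inside the job itself, started when the job runs and
# terminated when it finishes, so only that one job ever uses it.
requests.post(
    f"{HOST}/api/2.1/jobs/create",
    headers=HEADERS,
    json={
        "name": "nightly-etl",
        "tasks": [
            {
                "task_key": "etl",
                "notebook_task": {"notebook_path": "/Repos/<project>/nightly_etl"},  # placeholder
                "new_cluster": {
                    "spark_version": "<runtime-version>",
                    "node_type_id": "<gcp-node-type>",
                    "num_workers": 4,
                },
            }
        ],
    },
)
```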

Jobs Light cluster is Databricks’ equivalent of open-source Apache Spark. It targets simple, non-critical workloads that don’t need the performance, reliability, or autoscaling benefits provided by Databricks’ proprietary technologies. In comparison, the Jobs cluster provides you with all of the aforementioned benefits to boost your team productivity and reduce your total cost of ownership.

The 14-day free trial gives you access to either Standard or Premium feature sets depending on your choice of the plan. Contact us if you are interested in Databricks Enterprise or Dedicated plan for custom deployment and other enterprise customizations.

At the end of the trial, you are automatically enrolled in the plan you used during the free trial. You can cancel your subscription at any time.

By default, you are billed monthly to your credit card, and your usage of our services is billed per second. Contact us for other billing options, such as invoice billing or an annual billing plan.

We offer technical support with annual commitments. Contact us to learn more or get started.

Please contact us to get access to preview features.

Product Spend is calculated based on GCP product spend at list, before the application of any discounts, usage credits, add-on uplifts, or support fees.
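For illustration (an assumed example, reading Product Spend as Databricks platform spend at list prices): if a workspace consumes 100 DBUs of Premium All-Purpose Compute at 0.55 / DBU, the list-price Product Spend is 55, so the GCP HIPAA Compliance add-on would be 10% of that, or 5.50.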

Ready to get started?