Delta Live Tables Product Editions

Delta Live Tables (DLT) is the first ETL framework to use a simple, declarative approach to building reliable streaming or batch data pipelines while automatically managing infrastructure at scale. Select the DLT product edition that best fits each pipeline based on the features outlined below.
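As a minimal illustration of the declarative approach, a dataset can be defined in a single statement through the DLT SQL interface (a sketch only; the table name and path here are hypothetical):

```sql
-- Hypothetical example: declare a streaming table that ingests JSON files
-- with Auto Loader; DLT provisions and manages the underlying infrastructure.
CREATE OR REFRESH STREAMING LIVE TABLE raw_events
AS SELECT * FROM cloud_files('/data/events/', 'json');
```

DLT infers the dependency graph from such declarations and handles orchestration, retries, and cluster management for you.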

Product features

 

| Feature | DLT Core | DLT Pro | DLT Advanced |
|---|---|---|---|
| SQL API | ✓ | ✓ | ✓ |
| Python API | ✓ | ✓ | ✓ |
| Streaming Tables | ✓ | ✓ | ✓ |
| Continuous Pipelines | ✓ | ✓ | ✓ |
| Auto Loader integration | ✓ | ✓ | ✓ |
| Observability Metrics with Event Log | ✓ | ✓ | ✓ |
| Table Lineage | ✓ | ✓ | ✓ |
| Access Controls (ACL) | ✓ | ✓ | ✓ |
| Pipeline multi-cluster partitioning | ✓ | ✓ | ✓ |
| Enhanced Autoscaling (preview) | ✓ | ✓ | ✓ |
| Change Data Capture (CDC) | | ✓ | ✓ |
| Slowly Changing Dimension (SCD) Type 2 (preview) | | ✓ | ✓ |
| Data Quality Expectation Rules | | | ✓ |
| Data Quality Expectation Policies | | | ✓ |
| Data Quality Observability | | | ✓ |
| Photon Performance Pack | Optional | Optional | Optional |
| Observability Data Retention | 5 days | 30 days | 30 days |

DLT Pricing

 

| | DLT Core Compute / DLT Core Compute Photon | DLT Pro Compute / DLT Pro Compute Photon | DLT Advanced Compute / DLT Advanced Compute Photon |
|---|---|---|---|
| Per DBU price - All Tiers (Standard and Premium) | -- | -- | $0.40 |

FAQ

Can I choose a different product edition for each pipeline?

Yes. You can specify a different edition for each DLT pipeline in the pipeline settings.
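For example, the edition can be set per pipeline through the `edition` field of the pipeline's JSON settings, which accepts `CORE`, `PRO`, or `ADVANCED` (the pipeline name and notebook path below are hypothetical):

```json
{
  "name": "example_pipeline",
  "edition": "ADVANCED",
  "continuous": false,
  "libraries": [
    { "notebook": { "path": "/Users/someone@example.com/dlt_pipeline" } }
  ]
}
```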

How do I select a DLT product edition?

You choose a DLT product edition when a pipeline is created. Click "Create Pipeline" on the Delta Live Tables page, and you will see an option to select the product edition in the dialog that follows.

Can I change the edition of an existing pipeline?

Yes. You may change the DLT edition of a pipeline at any time through the DLT settings UI.

What happens if my pipeline uses features that are not included in its edition?

When you run a pipeline that uses features not included in its selected product edition, the pipeline fails with an error message explaining why. You can then change the pipeline's edition to one that includes the desired feature and re-run the pipeline.
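For instance, a pipeline that applies a data quality expectation such as the one below would require the Advanced edition (a hypothetical sketch; the table and column names are illustrative):

```sql
-- Hypothetical example: an expectation rule that drops rows with a NULL id.
-- Expectation rules are an Advanced-edition feature; running this under
-- Core or Pro would fail with an edition error.
CREATE OR REFRESH STREAMING LIVE TABLE clean_events (
  CONSTRAINT valid_id EXPECT (id IS NOT NULL) ON VIOLATION DROP ROW
)
AS SELECT * FROM STREAM(LIVE.raw_events);
```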

Do DLT maintenance tasks consume DBUs?

Yes. A DLT cluster automatically runs periodic system-generated jobs to maintain Delta Live Tables, and these jobs consume DBUs at the Jobs Compute rate.