
What’s new from
Data + AI Summit

Explore the latest breakthroughs in data and AI — from product launches to what’s coming next
INTRODUCING

Lakebase

The first fully managed, Postgres-compatible transactional database engine designed for developers and AI agents
INTRODUCING

Mosaic AI Agent Bricks

Production AI agents optimized on your data
PRIVATE PREVIEW

Lakeflow Designer

Production-quality ETL. No code required.
ADDITIONAL HIGHLIGHTS

Explore more of the latest product launches


Lakeflow

Now in General Availability, Lakeflow is an end-to-end data engineering solution for building reliable data pipelines faster on all your business-critical data. We've expanded ingestion capabilities and added a new declarative pipelines authoring experience.

Read the blog

Databricks Free Edition

Students and aspiring professionals will soon be able to develop critical skills in data and AI — for free. Ingest data, build dashboards and train AI models on the same platform trusted by more than 60% of the Fortune 500. Experimentation across the full range of use cases will be open to all. Now in Public Preview.

Read the blog

Databricks One

Business users will soon be able to interact with AI/BI Dashboards, ask questions of their data in natural language through AI/BI Genie powered by deep research, quickly find relevant dashboards, and use custom-built Databricks Apps — all in an elegant, code-free environment. Now in Private Preview.

Read the blog

AI/BI Genie

Now in General Availability, AI/BI Genie allows users to ask questions in natural language and get instant insights from data — no coding required. Genie delivers answers via text summaries, tabular data and visualizations, along with an explanation of how it arrived at the answer.

Coming soon: To answer complex questions, Genie Deep Research creates research plans and analyzes multiple hypotheses — complete with citations.

Read the blog

Unity Catalog

Unity Catalog is a unified governance solution for modern data, AI and business. Now it's the first catalog to deliver interoperability across compute engines, first-party support for multiple table formats and unified business semantics. Explore the new capabilities, now in preview.

Read the blog

Unity Catalog Metrics

Defining metrics at the data layer makes business semantics reusable across all workloads. Create metrics once in Unity Catalog and use them across AI/BI Dashboards, Genie, Notebooks, SQL and Lakeflow jobs.
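
For illustration, here is a hedged sketch of the "create once, use everywhere" flow. The metric view syntax (WITH METRICS, LANGUAGE YAML, the MEASURE() aggregate) and every catalog, schema and column name below are assumptions drawn from the metric views preview and may differ in your workspace.

```python
# Hedged sketch: define a Unity Catalog metric view once, then reuse it from SQL.
# Syntax and names are assumptions from the metric views preview; verify locally.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Define the metric at the data layer (catalog/schema/table names are illustrative).
spark.sql("""
CREATE VIEW main.finance.revenue_metrics
WITH METRICS
LANGUAGE YAML
AS $$
version: 0.1
source: main.finance.orders
dimensions:
  - name: order_month
    expr: date_trunc('month', order_date)
measures:
  - name: total_revenue
    expr: SUM(order_amount)
$$
""")

# The same definition can now back dashboards, Genie, notebooks and SQL queries.
spark.sql("""
SELECT order_month, MEASURE(total_revenue) AS total_revenue
FROM main.finance.revenue_metrics
GROUP BY order_month
""").show()
```

Because the aggregation logic lives in the view, every workload that queries total_revenue computes it the same way.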

Unity Catalog Discover

Find “the good stuff” with a curated internal marketplace of certified data products organized by business domains like Sales, Marketing and Finance. AI-powered recommendations help surface the highest-value assets.

Unity Catalog Iceberg Managed Tables

Govern Iceberg tables in Unity Catalog and access them from anywhere. Get best-in-class price/performance, liquid clustering and predictive optimization — plus full integration with Databricks and external engines.
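
As a rough sketch, assuming managed Iceberg tables use the standard CREATE TABLE ... USING ICEBERG syntax (catalog, schema and column names below are placeholders):

```python
# Hedged sketch: create a Unity Catalog managed Iceberg table and read it back.
# Assumes the USING ICEBERG clause from the managed Iceberg tables preview;
# all object names are illustrative.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("""
CREATE TABLE IF NOT EXISTS main.analytics.events_iceberg (
  event_id BIGINT,
  event_type STRING,
  event_ts TIMESTAMP
)
USING ICEBERG
""")

# Unity Catalog governs the table; Databricks engines query it directly, and
# external Iceberg engines can reach it through the catalog's Iceberg endpoint.
spark.table("main.analytics.events_iceberg").printSchema()
```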


Databricks Apps

Build, deploy and scale interactive data intelligence apps within your fully governed and secure Databricks environment to rapidly deliver user-facing tools — from LLM copilots and data quality dashboards to team-specific ops apps — where your data and AI assets already live.
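
For a sense of what this looks like in practice, here is a hypothetical sketch of an app entry point that queries a SQL warehouse with databricks-sql-connector and renders the result with Streamlit. The environment variable names and the table it reads are placeholders, and a deployed Databricks App would also ship its own app configuration.

```python
# Hypothetical sketch of a Databricks App entry point: a small Streamlit UI that
# reads from a SQL warehouse via databricks-sql-connector. Environment variable
# names and the monitoring table are placeholders.
import os

import streamlit as st
from databricks import sql

st.title("Data quality overview")

with sql.connect(
    server_hostname=os.environ["DATABRICKS_SERVER_HOSTNAME"],  # placeholder env vars
    http_path=os.environ["DATABRICKS_HTTP_PATH"],
    access_token=os.environ["DATABRICKS_TOKEN"],
) as connection:
    with connection.cursor() as cursor:
        cursor.execute(
            "SELECT table_name, null_rate, freshness_hours "
            "FROM main.monitoring.table_quality "  # illustrative table
            "ORDER BY null_rate DESC LIMIT 20"
        )
        columns = [c[0] for c in cursor.description]
        rows = cursor.fetchall()

st.dataframe([dict(zip(columns, row)) for row in rows])
```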

Read the blog

Lakebridge

Automate migration from legacy data warehouses to Databricks SQL — and speed up implementation by 2x. Setting a new standard for end-to-end migration, Lakebridge covers profiling, assessment, conversion, validation and reconciliation.

Read the blog

Databricks Clean Rooms

Now in General Availability on Google Cloud, Databricks Clean Rooms rounds out a truly comprehensive multicloud offering. Create a central clean room environment and collaborate with partners across AWS, Azure, GCP or any other data platform.

Read the blog
BUILT TO SHARE

Open source innovation


Spark Declarative Pipelines

Define and execute data pipelines for both batch and streaming ETL workloads across any Apache Spark™-supported data source — including cloud storage, message buses, change data feeds and external systems.
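
As a sketch of the declarative style, the example below uses the DLT-flavored Python decorators that Spark Declarative Pipelines is based on; module and decorator names in the open-source Apache Spark release may differ, and the storage path is a placeholder.

```python
# Sketch of a declarative batch + streaming pipeline in the DLT-style Python API
# that Spark Declarative Pipelines builds on; names in the open-source release
# may differ. `spark` is provided by the pipeline runtime; the path is a placeholder.
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="Raw events streamed from cloud storage.")
def raw_events():
    # Streaming source: incrementally ingest JSON files as they arrive.
    return spark.readStream.format("json").load("s3://example-bucket/events/")


@dlt.table(comment="Cleaned events with parsed timestamps.")
@dlt.expect_or_drop("valid_event_id", "event_id IS NOT NULL")
def clean_events():
    return (
        dlt.read_stream("raw_events")
        .withColumn("event_ts", F.to_timestamp("event_time"))
        .drop("event_time")
    )


@dlt.table(comment="Daily event counts, recomputed as a batch materialization.")
def daily_event_counts():
    return (
        dlt.read("clean_events")
        .groupBy(F.to_date("event_ts").alias("event_date"))
        .count()
    )
```

The framework resolves the dependency graph between the three tables and decides what runs incrementally versus what is recomputed, so the same definitions cover both streaming and batch execution.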

Read the blog
CUSTOMER SUCCESS STORIES

Data intelligence in action

Ready to become a
data + AI company?

Take the first steps in your data transformation