We’re excited to announce Lakeflow Designer, an AI-powered, no-code pipeline builder that is fully integrated with the Databricks Data Intelligence Platform. With a visual canvas and built-in natural language interface, Designer lets business analysts build scalable production pipelines and perform data analysis without writing a single line of code, all in a single, unified product.
Every pipeline built in Designer is a Lakeflow Declarative Pipeline under the hood. Data engineers can review, understand, and improve these pipelines without switching tools or rewriting logic, because Designer emits the same ANSI SQL used across Databricks. This lets no-code users participate in data work without creating additional overhead for engineers.
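To make this concrete, here is a minimal sketch of the kind of Declarative Pipeline SQL a visual flow might compile down to (the source path, table names, and columns are hypothetical illustrations, not Designer’s actual output):

```sql
-- Ingest raw order events incrementally from cloud storage.
CREATE OR REFRESH STREAMING TABLE raw_orders
AS SELECT *
FROM STREAM read_files(
  '/Volumes/main/sales/landing/orders',  -- hypothetical source path
  format => 'json'
);

-- Keep an always-up-to-date daily summary derived from the raw table.
CREATE OR REFRESH MATERIALIZED VIEW daily_sales
AS SELECT
  order_date,
  SUM(amount) AS total_amount,
  COUNT(*)    AS order_count
FROM raw_orders
GROUP BY order_date;
```

Because the definitions are declarative, the runtime works out dependency order, incremental processing, and retries rather than the author scripting them.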
Lakeflow Designer will be available in Private Preview in the coming months, following the General Availability of Lakeflow announced today. We’re excited for you to see the impact it can make.
Data teams and business analysts want the same thing: to turn raw data into insights fast. But the tools they use and the environments they work in often pull them in different directions.
Business analysts bring valuable domain knowledge and insight into the questions that drive an organization’s decisions. To move fast, they often create quick solutions in spreadsheets or, when those aren’t enough, turn to legacy no-code tools. These tools are approachable, but the pipelines they create live outside the data platform, separated from the pipelines engineers build and maintain.
This separation creates three persistent challenges:

- Siloed workflows: business and data teams build in separate environments, so every handoff means re-explaining requirements and duplicating work.
- Governance gaps: pipelines created outside the platform miss the observability, lineage, and access controls that govern everything else.
- Costly rewrites: making an analyst-built pipeline production-grade means re-platforming and rebuilding it, consuming engineering cycles.
Lakeflow Designer solves these problems by bringing business and data teams into a single, unified environment where pipelines are created, managed, and governed on Databricks. With all pipelines built natively within the platform, teams gain built-in observability, governance, and scale from day one, with no rewrites required.
Lakeflow Designer directly addresses siloed workflows by providing a unified platform where teams can build together. Business users create pipelines in a visual, no-code environment that’s familiar and intuitive. Behind the scenes, each visual pipeline is implemented as a Lakeflow Declarative Pipeline, running on the same scalable, reliable runtime as pipelines that engineers author directly.
Since Designer’s output is the same as hand-written pipeline code, data engineers can inspect and edit these pipelines just like any other Lakeflow pipeline. If business analysts run into issues, engineers can jump in to help without learning a new tool, because it’s the same Declarative Pipelines they’re used to. The result is smoother handoffs and fewer rebuilds eating into engineering cycles.
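For example, an engineer might harden a Designer-built table by adding a data quality expectation in standard Declarative Pipelines SQL (the table, path, and constraint below are hypothetical):

```sql
-- Engineer's edit: drop any ingested rows with a non-positive amount.
CREATE OR REFRESH STREAMING TABLE raw_orders (
  CONSTRAINT valid_amount EXPECT (amount > 0) ON VIOLATION DROP ROW
)
AS SELECT *
FROM STREAM read_files(
  '/Volumes/main/sales/landing/orders',  -- hypothetical source path
  format => 'json'
);
```

The analyst’s visual flow keeps working unchanged; the expectation simply enforces quality on the shared definition.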
Lakeflow Designer pipelines are deployed as Lakeflow Declarative Pipelines, full stop. This means they’re immediately ready for production use. They’re versioned, governed by Unity Catalog, and fully observable with Lakeflow monitoring tools.
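As a sketch of what that looks like day to day (the table, principal, and pipeline ID below are placeholders), access is granted through Unity Catalog and the pipeline’s event log is queryable with ordinary SQL:

```sql
-- Unity Catalog governs the pipeline's output tables like any other asset.
GRANT SELECT ON TABLE main.sales.daily_sales TO `analysts`;

-- Monitoring: inspect recent pipeline events from the event log.
SELECT timestamp, event_type, message
FROM event_log('<pipeline-id>')  -- substitute your pipeline's ID
ORDER BY timestamp DESC
LIMIT 20;
```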
Because everything is native to the Databricks platform, you don’t need to re-platform pipelines to make them production-grade. You get scheduling, testing, and alerting from day one.
Designer delivers an AI-first development experience, helping users move from idea to pipeline with natural language prompts. What sets Designer’s AI apart is that it is grounded in the structure, semantics, and usage patterns of your data, made possible by the unified Databricks Data Intelligence Platform. It knows how data is actually talked about and used across the business, including table definitions, column names, and query history.
Other tools may offer AI features, but because they live outside the platform, they operate without this rich context. Designer’s AI is different: deeply integrated with Databricks, trained on your data's structure and semantics, and built to generate trustworthy, production-grade pipelines that align with your existing workflows and governance standards.
Lakeflow Designer will be available in Private Preview in the coming months. We’re collaborating closely with early users across industries to refine the experience and expand access.
If your team wants to give more users the ability to build trusted pipelines without increasing risk or adding new tools, get in touch with your Databricks account team to request access.