Building internal tools or AI-powered applications the “traditional” way throws developers into a maze of repetitive, error-prone tasks. First, they must spin up a dedicated Postgres instance, configure networking, backups, and monitoring, and then spend hours (or days) plumbing that database into their front-end framework. On top of that, they have to write custom authentication flows, map granular permissions, and keep those security controls in sync across the UI, the API layer, and the database. Each application component often lives in a different environment, ranging from managed cloud services to self-hosted VMs, which forces developers to juggle disparate deployment pipelines, environment variables, and credential stores. The result is a fragmented stack where a single change, like a schema migration or a new role, ripples through multiple systems, demanding manual updates, extensive testing, and constant coordination. All of this overhead distracts developers from the real value-add: building the product’s core features and intelligence.
With Databricks Lakebase and Databricks Apps, the entire application stack sits together, alongside the lakehouse. Lakebase is a fully managed Postgres database that offers low-latency reads and writes, integrated with the same underlying lakehouse tables that power your analytics and AI workloads. Databricks Apps supplies a serverless runtime for the UI, along with built-in authentication, fine-grained permissions, and governance controls that are automatically applied to the same data that Lakebase serves. This makes it easy to build and deploy apps that combine transactional state, analytics, and AI without stitching together multiple platforms, synchronizing databases, replicating pipelines, or reconciling security policies across systems.
Lakebase and Databricks Apps work together to simplify full-stack development on the Databricks platform:
- Lakebase provides the transactional data layer: a fully managed Postgres database with low-latency reads and writes, integrated with the lakehouse.
- Databricks Apps provides the application layer: a serverless runtime with built-in authentication, fine-grained permissions, and governance.
By combining the two, you can build interactive tools that store and update state in Lakebase, access governed data in the lakehouse, and serve everything through a secure, serverless UI, all without managing separate infrastructure. In the example below, we’ll show how to build a simple holiday request approval app using this setup.
This walkthrough shows how to create a simple Databricks App that helps managers review and approve holiday requests from their team. The app is built with Databricks Apps and uses Lakebase as the backend database to store and update the requests.
Here’s what the solution covers:
- Provisioning a Lakebase database instance
- Creating a Databricks App and connecting it to the database
- Defining the schema and table and granting the app access
- Building the app logic to read and update holiday requests
The walkthrough is designed to get you started quickly with a minimal working example. Later, you can extend it with more advanced configuration.
Before building the app, you’ll need to create a Lakebase database. To do this, go to the Compute tab, select OLTP Database, and provide a name and size. This provisions a serverless Lakebase instance. In this example, our database instance is called lakebase-demo-instance.
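This takes only a few clicks in the UI. If you prefer to script the step, recent versions of the Databricks Python SDK expose a database instances API; the sketch below is a hedged illustration (the instance name and capacity value are assumptions for this demo, so verify the call against the SDK reference for your version):

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.database import DatabaseInstance

w = WorkspaceClient()

# Provision a serverless Lakebase instance; capacity is expressed in
# capacity units (the "CU_1" value here is an assumption for this demo).
instance = w.database.create_database_instance(
    DatabaseInstance(name="lakebase-demo-instance", capacity="CU_1")
)
print(instance)
```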
Now that we have a database, let’s create the Databricks App that will connect to it. You can start from a blank app or choose a template (e.g., Streamlit or Flask). After naming your app, add the Database as a resource. In this example, the pre-created databricks_postgres database is selected.
Adding the Database resource automatically:
- Creates a Postgres role for the app’s service principal, identified by its client ID
- Injects the connection details (such as PGHOST and PGUSER) into the app’s environment variables
This role will later be used to grant table-level access.
With the database provisioned and the app connected, you can now define the schema and table the app will use.
From the app’s Environment tab, copy the value of the DATABRICKS_CLIENT_ID variable. You’ll need this for the GRANT statements.
Go to your Lakebase instance and click New Query. This opens the SQL editor with the database endpoint already selected.
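For this walkthrough, a minimal schema could look like the following. The schema, table, and column names are illustrative choices for this demo, and the quoted role is a placeholder; replace it with the DATABRICKS_CLIENT_ID value you copied from the app’s Environment tab:

```sql
-- Demo schema and table for holiday requests (names are illustrative).
CREATE SCHEMA IF NOT EXISTS holiday_app;

CREATE TABLE IF NOT EXISTS holiday_app.holiday_requests (
    id         SERIAL PRIMARY KEY,
    employee   TEXT NOT NULL,
    start_date DATE NOT NULL,
    end_date   DATE NOT NULL,
    status     TEXT NOT NULL DEFAULT 'pending'  -- 'pending' | 'approved' | 'rejected'
);

-- Grant the app's service principal access to read and update requests.
GRANT USAGE ON SCHEMA holiday_app TO "<DATABRICKS_CLIENT_ID>";
GRANT SELECT, INSERT, UPDATE ON holiday_app.holiday_requests TO "<DATABRICKS_CLIENT_ID>";
```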
Note that while the SQL editor is a quick and effective way to run these statements, managing database schemas at scale is best handled by dedicated migration tools that support versioning, collaboration, and automation. Tools like Flyway and Liquibase let you track schema changes, integrate with CI/CD pipelines, and ensure your database structure evolves safely alongside your application code.
With permissions in place, you can now build your app. In this example, the app fetches holiday requests from Lakebase and lets a manager approve or reject them. Updates are written back to the same table.
Use SQLAlchemy and the Databricks SDK to connect your app to Lakebase with secure, token-based authentication. When you add the Lakebase resource, PGHOST and PGUSER are exposed automatically. The SDK handles token caching.
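A minimal connection sketch is shown below. It assumes the injected PGHOST and PGUSER variables and the databricks_postgres database from earlier; using the SDK’s cached OAuth token as the Postgres password is one way to wire up token-based authentication:

```python
import os
from databricks.sdk import WorkspaceClient
from sqlalchemy import create_engine, event

w = WorkspaceClient()

# PGHOST and PGUSER are injected automatically by the Database resource;
# databricks_postgres is the database selected when the app was created.
engine = create_engine(
    f"postgresql+psycopg2://{os.environ['PGUSER']}@{os.environ['PGHOST']}:5432/"
    "databricks_postgres?sslmode=require",
    pool_pre_ping=True,
)

@event.listens_for(engine, "do_connect")
def provide_token(dialect, conn_rec, cargs, cparams):
    # Supply a fresh OAuth token as the password for each new connection;
    # the SDK caches tokens internally, so refreshes stay cheap.
    cparams["password"] = w.config.oauth_token().access_token
```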
The following functions read from and update the holiday request table:
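Here is a minimal sketch of those helpers, assuming the engine from the previous snippet and the demo table created in the schema step:

```python
import pandas as pd
from sqlalchemy import text

def fetch_pending_requests() -> pd.DataFrame:
    # Read pending requests into a DataFrame for display in the UI.
    query = text(
        "SELECT id, employee, start_date, end_date, status "
        "FROM holiday_app.holiday_requests "
        "WHERE status = 'pending' ORDER BY start_date"
    )
    with engine.connect() as conn:
        return pd.read_sql(query, conn)

def update_request_status(request_id: int, new_status: str) -> None:
    # Write the manager's decision back to the same table.
    stmt = text(
        "UPDATE holiday_app.holiday_requests SET status = :status WHERE id = :id"
    )
    with engine.begin() as conn:  # begin() commits automatically on success
        conn.execute(stmt, {"status": new_status, "id": request_id})
```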
The code snippets above can be used with frameworks such as Streamlit, Dash, and Flask to pull data from Lakebase and visualize it in your app. To ensure all necessary dependencies are installed, add the required packages to your app’s requirements.txt file. The packages used in the code snippets are listed below.
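For the snippets in this walkthrough, that file would contain:

```
databricks-sdk
sqlalchemy
psycopg2-binary
pandas
streamlit
```

To show how the pieces fit together, here is a minimal, hypothetical Streamlit view built on the helper functions defined above; the layout, labels, and widget keys are illustrative:

```python
import streamlit as st

st.title("Holiday Request Approvals")

# Render each pending request with Approve/Reject buttons.
for row in fetch_pending_requests().itertuples():
    info, approve, reject = st.columns([3, 1, 1])
    info.write(f"{row.employee}: {row.start_date} to {row.end_date}")
    if approve.button("Approve", key=f"approve-{row.id}"):
        update_request_status(row.id, "approved")
        st.rerun()
    if reject.button("Reject", key=f"reject-{row.id}"):
        update_request_status(row.id, "rejected")
        st.rerun()
```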
Lakebase adds transactional capabilities to the lakehouse by integrating a fully managed OLTP database directly into the platform. This reduces the need for external databases or complex pipelines when building applications that require both reads and writes.
Lakebase is natively integrated with Databricks, including data synchronization, identity and authentication, and network security, so it is governed just like other data assets in the lakehouse. You don’t need custom ETL or reverse ETL to move data between systems. For example:
- Lakehouse tables can be synced into Lakebase for low-latency serving, with no reverse ETL pipeline to build or maintain.
- Lakebase tables can be registered in Unity Catalog and queried alongside the rest of your lakehouse data.
These capabilities make it easier to support production-grade use cases.
Lakebase is already being used across industries for applications including personalized recommendations, chatbot applications, and workflow management tools.
If you’re already using Databricks for analytics and AI, Lakebase makes adding real-time interactivity to your applications easier. With support for low-latency transactions, built-in security, and tight integration with Databricks Apps, you can go from prototype to production without leaving the platform.
Lakebase provides a transactional Postgres database that works seamlessly with Databricks Apps and integrates easily with lakehouse data. It simplifies the development of full-stack data and AI applications by eliminating the need for external OLTP systems or manual integration steps.
In this example, we showed how to:
- Provision a serverless Lakebase instance
- Create a Databricks App and add the database as a resource
- Define a schema and table and grant the app’s service principal table-level access
- Connect the app to Lakebase with token-based authentication and read and update holiday requests from the UI
Lakebase is now in Public Preview. You can try it today directly from your Databricks workspace. For details on usage and pricing, see the Lakebase and Apps documentation.