---
title: Platform overview
description: How Databricks Apps, AppKit, Lakebase, and Agent Bricks fit together. Architecture reference for the Databricks developer stack.
---

# Platform overview

Apps, AppKit, Lakebase Postgres, and Agent Bricks are the layers of a full-stack Databricks application that runs inside your workspace. New to building on Databricks? See [Start here](https://databricks.com/devhub/docs/start-here) first.

<img src="/img/docs/platform-overview.svg" alt="Architecture diagram showing Apps containing AppKit, Lakebase, and Agents, with Unity Catalog and AI Gateway as workspace services" width="100%" />

| Layer                 | What it is                                                                                                                                                                                                           |
| --------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| **Apps**              | The hosting layer. Your app runs as a managed workspace resource with a fixed URL, built-in auth, and managed compute. Deploy with `databricks apps deploy`.                                                         |
| **AppKit**            | The TypeScript SDK for building apps on Databricks Apps. Built-in Databricks OAuth handling, pre-built React UI components (data tables, charts, dialogs), and a plugin system for connecting to workspace services. |
| **Lakebase Postgres** | The database layer. Managed Postgres for OLTP, co-located with your Lakehouse. Autoscales on demand, scales to zero when idle, and supports branching for development environments.                                  |
| **Agent Bricks**      | The AI layer. Call Knowledge Assistants, Supervisor Agents, and governed LLM endpoints via the Model Serving plugin. Query Unity Catalog tables in natural language via the Genie plugin.                            |
| **Data Lakehouse**    | The analytical data tier. Unity Catalog tables, materialized views, and streaming tables populated by Lakeflow. Read with the Analytics plugin. Trigger Lakeflow Jobs from your app.                                 |
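To make the database layer concrete, the sketch below assembles a Postgres connection string for a Lakebase instance. The host, database, and role values are placeholders, and the use of an OAuth token as the password is an assumption for this sketch that mirrors Lakebase's token-based login; adapt it to your workspace.

```typescript
// Build a libpq-style connection URL for a Lakebase Postgres instance.
// All concrete values (host, database, user) are placeholders, and the
// token-as-password convention is an assumption for this sketch.
function lakebaseUrl(
  host: string,
  database: string,
  user: string,
  oauthToken: string,
): string {
  const enc = encodeURIComponent;
  // SSL is required for managed Postgres endpoints, hence sslmode=require.
  return `postgresql://${enc(user)}:${enc(oauthToken)}@${host}:5432/${enc(database)}?sslmode=require`;
}

// Example (placeholder values):
const url = lakebaseUrl(
  "my-instance.database.cloud.databricks.com",
  "appdb",
  "app-service-principal",
  "dapi-example-token",
);
```

A URL like this can be passed directly to a standard Postgres client such as node-postgres.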

## How a request flows

A user opens the app, and Databricks Apps authenticates them through workspace single sign-on (SSO). The relevant AppKit plugin then queries Lakebase, a Genie space, or a model serving endpoint on the user's behalf, using either the app's service principal or the user's forwarded access token.
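The auth decision in that flow can be sketched as a small helper. The `x-forwarded-access-token` header name matches what Databricks Apps forwards to an app when user authorization is enabled; the service-principal token parameter is a placeholder for however your app obtains its own credential.

```typescript
// Per-request auth selection, as described above: prefer the user's
// forwarded token when present, otherwise fall back to the app's own
// service-principal credential (placeholder in this sketch).
type AuthMode = "user" | "service-principal";

function resolveAuth(
  headers: Record<string, string | undefined>,
  servicePrincipalToken: string,
): { mode: AuthMode; token: string } {
  const userToken = headers["x-forwarded-access-token"];
  return userToken
    ? { mode: "user", token: userToken } // act as the signed-in user
    : { mode: "service-principal", token: servicePrincipalToken }; // act as the app
}
```

Downstream calls to Lakebase, Genie, or a serving endpoint would then attach the selected token as the bearer credential.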

## Where to next

These docs go deeper on each platform layer:

- **[Web apps (Databricks Apps)](https://databricks.com/devhub/docs/apps/quickstart)**: Scaffold and deploy an AppKit app (TypeScript) or a Python app on Databricks Apps.
- **[Lakebase Postgres](https://databricks.com/devhub/docs/lakebase/quickstart)**: Provision a Lakebase Postgres project and connect it to an app.
- **[Data Lakehouse](https://databricks.com/devhub/docs/lakehouse/overview)**: Read Unity Catalog tables with the Analytics plugin, trigger Lakeflow Jobs, and surface data freshness.
- **[Agent Bricks](https://databricks.com/devhub/docs/agents/overview)**: Governed model endpoints ([AI Gateway](https://databricks.com/devhub/docs/agents/ai-gateway)), natural-language data queries ([Genie](https://databricks.com/devhub/docs/agents/genie)), and custom agent endpoints ([Custom agents](https://databricks.com/devhub/docs/agents/custom-agents)).
- **[Set up your environment](https://databricks.com/devhub/docs/tools/databricks-cli)**: Databricks CLI, agent skills, and MCP server.
- **[AppKit Reference](https://databricks.com/devhub/docs/appkit/v0)**: Component library, plugin API, and TypeScript SDK reference.
