---
title: App configuration
sidebar_label: Configuration
---

# App configuration

Two files control how your AppKit app starts and what it connects to: `app.yaml` (runtime behavior and environment variables) and `databricks.yml` (Databricks resources). Each app gets a fixed URL assigned at creation; the URL cannot be changed later.

:::tip[Building with Python?]

AppKit targets TypeScript on Node.js. Python app development is not covered on this site. See the [Databricks Apps docs](https://docs.databricks.com/aws/en/dev-tools/databricks-apps/) for Python frameworks (Gradio, Streamlit, Dash).

:::

## Configuration files

**`app.yaml`** controls runtime behavior (startup command and environment variables):

```yaml
command: ["npm", "run", "start"]
env:
  - name: LAKEBASE_ENDPOINT
    valueFrom: postgres
  - name: WAREHOUSE_ID
    valueFrom: sql-warehouse
```

The `command` is a sequence (array), not a shell string. Environment variable expansion is not supported in `command` except for `DATABRICKS_APP_PORT`.
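For example, a sketch that passes the injected port as a CLI argument (the `server.js` entry point and `--port` flag are illustrative, and the `$VAR` expansion syntax is an assumption; most apps simply read `process.env.DATABRICKS_APP_PORT` at startup instead):

```yaml
command: ["node", "server.js", "--port", "$DATABRICKS_APP_PORT"]
```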

**`databricks.yml`** declares Databricks resources, variables, and deployment targets:

```yaml
resources:
  apps:
    my-app:
      resources:
        - name: postgres
          postgres:
            branch: ${var.postgres_branch}
            database: ${var.postgres_database}
            permission: CAN_CONNECT_AND_CREATE
```

Variables like `${var.postgres_branch}` are resolved from the `variables` section of `databricks.yml` or from CLI flags at deploy time.
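A minimal sketch of the corresponding `variables` section (the names match the resource example above; descriptions and defaults are illustrative):

```yaml
variables:
  postgres_branch:
    description: Lakebase branch to connect to
    default: main
  postgres_database:
    description: Database name
    default: appdb
```

At deploy time, override a default from the CLI, for example `databricks bundle deploy --var="postgres_branch=dev"`.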

For the full AppKit-specific `app.yaml` reference including plugin resource bindings, see [AppKit configuration](https://databricks.com/devhub/docs/appkit/v0/configuration).

## Plugin manifest

Each AppKit app has an `appkit.plugins.json` file that declares which plugins are active and what Databricks resources they require. This file is generated by running:

```bash
npx @databricks/appkit plugin sync --write
```

The sync also runs automatically during `npm run dev` and `npm run build`. Commit the generated file alongside your code; the CLI and deployment pipeline use it to provision resources.

## Resources

Apps access Databricks services through declared resources. Declare resources in `databricks.yml` and bind them to environment variables in `app.yaml` using `valueFrom`. Common resource types used in AppKit templates:

| Resource                                                                             | `valueFrom` key    | What it provides               |
| ------------------------------------------------------------------------------------ | ------------------ | ------------------------------ |
| [Lakebase Postgres](https://databricks.com/devhub/docs/lakebase/quickstart)                                       | `postgres`         | PostgreSQL connection          |
| [SQL Warehouse](https://docs.databricks.com/aws/en/compute/sql-warehouse/index.html) | `sql-warehouse`    | SQL query execution            |
| [Model Serving](https://databricks.com/devhub/docs/agents/ai-gateway)                                             | `serving-endpoint` | AI model inference             |
| [Genie space](https://databricks.com/devhub/docs/agents/genie)                                                    | `genie-space`      | Natural language data queries  |
| [Job](https://docs.databricks.com/aws/en/dev-tools/databricks-apps/resources)        | `job`              | Scheduled or triggered job     |
| [UC Volumes](https://docs.databricks.com/aws/en/files/index.html)                    | `volume`           | File storage                   |
| [Secrets](https://docs.databricks.com/aws/en/dev-tools/databricks-apps/secrets)      | `secret`           | Sensitive configuration values |

Additional resource types (Unity Catalog tables, connections, vector search indexes, MLflow experiments, and others) are listed in the [official resources documentation](https://docs.databricks.com/aws/en/dev-tools/databricks-apps/resources).
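Putting the two halves together, a sketch of `app.yaml` bindings using the `valueFrom` keys from the table (the environment variable names on the left are arbitrary choices, not platform conventions):

```yaml
env:
  - name: SERVING_ENDPOINT_NAME
    valueFrom: serving-endpoint
  - name: GENIE_SPACE_ID
    valueFrom: genie-space
  - name: UPLOADS_VOLUME_PATH
    valueFrom: volume
```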

## Environment variables

The platform injects these variables automatically at runtime:

| Variable                   | Description                     |
| -------------------------- | ------------------------------- |
| `DATABRICKS_HOST`          | Workspace URL                   |
| `DATABRICKS_APP_PORT`      | Port your app must listen on    |
| `DATABRICKS_APP_NAME`      | App name                        |
| `DATABRICKS_CLIENT_ID`     | Service principal client ID     |
| `DATABRICKS_CLIENT_SECRET` | Service principal client secret |
| `DATABRICKS_WORKSPACE_ID`  | Workspace ID                    |
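Your app must bind to the injected port. A minimal sketch (the `8000` fallback is an arbitrary choice for local development, not a platform default):

```typescript
import { createServer } from "node:http";

// DATABRICKS_APP_PORT is injected by the platform at runtime;
// fall back to an arbitrary port for local runs.
const port = Number(process.env.DATABRICKS_APP_PORT ?? 8000);

const server = createServer((_req, res) => {
  res.writeHead(200, { "content-type": "text/plain" });
  res.end(`hello from ${process.env.DATABRICKS_APP_NAME ?? "local"}`);
});

server.listen(port);
```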

Custom variables go in `app.yaml` under `env`. Use `value` for plain values and `valueFrom` for resource bindings and secrets. Never put secrets in `value`.
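A sketch combining both forms (the variable names, the `debug` value, and the `api-key-secret` resource name are illustrative):

```yaml
env:
  - name: LOG_LEVEL
    value: debug # plain value: safe to commit
  - name: DATABASE_URL
    valueFrom: postgres # resolved from a declared resource
  - name: API_KEY
    valueFrom: api-key-secret # resolved from a declared secret, never inlined
```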

## Auth model

Each app gets a dedicated service principal. Databricks injects `DATABRICKS_CLIENT_ID` and `DATABRICKS_CLIENT_SECRET` automatically at runtime and deletes the service principal when the app is deleted.
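The injected credentials can be exchanged for a workspace token via the standard OAuth client-credentials flow. A sketch of building that request (the `/oidc/v1/token` path and `all-apis` scope follow Databricks' documented machine-to-machine OAuth flow; verify them against the auth docs for your workspace):

```typescript
// Build an OAuth client-credentials token request from the
// platform-injected service principal credentials.
function buildTokenRequest(env: Record<string, string | undefined>): {
  url: string;
  method: string;
  headers: Record<string, string>;
  body: string;
} {
  const host = env.DATABRICKS_HOST ?? "";
  const id = env.DATABRICKS_CLIENT_ID ?? "";
  const secret = env.DATABRICKS_CLIENT_SECRET ?? "";
  const basic = Buffer.from(`${id}:${secret}`).toString("base64");
  return {
    url: `${host}/oidc/v1/token`,
    method: "POST",
    headers: {
      Authorization: `Basic ${basic}`,
      "Content-Type": "application/x-www-form-urlencoded",
    },
    body: "grant_type=client_credentials&scope=all-apis",
  };
}
```

The result can be passed to `fetch`; most apps will instead rely on a Databricks SDK or AppKit plugin to handle this exchange.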

**User authorization** (Public Preview) forwards the signed-in user's token through the `x-forwarded-access-token` HTTP header. Scopes (for example, `sql`, `dashboards.genie`, `files.files`) are configured in the workspace UI. AppKit's built-in [Genie](https://databricks.com/devhub/docs/agents/genie) and [Model Serving](https://databricks.com/devhub/docs/agents/ai-gateway) plugins use this automatically. See [execution context](https://databricks.com/devhub/docs/appkit/v0/plugins/execution-context) for the AppKit implementation, or [app authorization](https://docs.databricks.com/aws/en/dev-tools/databricks-apps/auth) for the full platform details.
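If you handle requests outside those plugins, the forwarded token can be read from the request headers. A sketch (the header name comes from the docs; the helper itself is illustrative):

```typescript
// Extract the signed-in user's token forwarded by the platform's
// auth proxy in the x-forwarded-access-token header.
function userToken(
  headers: Record<string, string | string[] | undefined>,
): string | undefined {
  const raw = headers["x-forwarded-access-token"];
  return Array.isArray(raw) ? raw[0] : raw;
}
```

Treat the token as sensitive: use it for the current request only and never log or persist it.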

## Compute

| Size   | vCPU    | RAM   | DBU |
| ------ | ------- | ----- | --- |
| Medium | Up to 2 | 6 GB  | 0.5 |
| Large  | Up to 4 | 12 GB | 1.0 |

Medium is the default. Compute size is configured in the workspace UI (not available through the CLI).

## Constraints

- No durable filesystem (use [Lakebase Postgres](https://databricks.com/devhub/docs/lakebase/quickstart), DBSQL, or UC Volumes for persistence)
- Files larger than 10 MB fail deployment
- Apps get 15 seconds between SIGTERM and SIGKILL to shut down gracefully
- Runtime: Ubuntu 22.04, Node 22, Python 3.11

See [Best practices](https://docs.databricks.com/aws/en/dev-tools/databricks-apps/best-practices) for guidelines on shutdown handling, secrets hygiene, and networking.
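Given the 15-second SIGTERM window, a shutdown handler can stop accepting connections and let in-flight requests drain. A minimal sketch (the port fallback and the 14-second hard-exit margin are illustrative choices):

```typescript
import { createServer } from "node:http";

const server = createServer((_req, res) => {
  res.end("ok");
});
server.listen(Number(process.env.DATABRICKS_APP_PORT ?? 8000));

// The platform sends SIGTERM and waits 15 seconds before SIGKILL:
// stop accepting new connections, finish in-flight requests, then exit.
process.on("SIGTERM", () => {
  server.close(() => process.exit(0));
  // Hard-exit just before the SIGKILL deadline if requests are still draining.
  setTimeout(() => process.exit(1), 14_000).unref();
});
```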

## App statuses

| Status    | Meaning                            |
| --------- | ---------------------------------- |
| Running   | App is healthy and serving traffic |
| Deploying | New deployment is in progress      |
| Crashed   | App failed to start or exited      |
| Stopped   | App was manually stopped           |

## Where to next

See [Apps development](https://databricks.com/devhub/docs/apps/development) for local setup, deploy flags, and the full plugin API, or browse the [templates catalog](https://databricks.com/devhub/templates) for complete patterns.
