App development

This page is the CLI and workflow reference for Databricks Apps and AppKit. It covers adding plugins, scaffolding, deploying, managing, and troubleshooting your app.

Local setup

Copy .env.example to .env and fill in your workspace URL and resource IDs before running npm run dev. AppKit reads these for local connections to Databricks resources.

Example .env for an app with Lakebase Postgres:

DATABRICKS_HOST=https://<workspace>.cloud.databricks.com
LAKEBASE_ENDPOINT=projects/<project>/branches/production/endpoints/primary

If your app uses Lakebase, also grant your local user the databricks_superuser role before running locally. The app's service principal creates schemas and tables on first deploy and owns them. Without this grant, your local identity cannot access those objects:

GRANT databricks_superuser TO "<your-email>";

See Lakebase Development for the full local access workflow.

For testing against production data without redeploying, see the remote bridge.

Add a plugin

To add a plugin to an existing app, import and register it in createApp in server/server.ts:

import { createApp, server, lakebase, genie } from "@databricks/appkit";

const AppKit = await createApp({
plugins: [server(), lakebase(), genie()],
});

Then regenerate appkit.plugins.json with the updated resource requirements:

npx @databricks/appkit plugin sync --write

This runs automatically during npm run dev and npm run build. Commit the updated appkit.plugins.json alongside your code. It tells the deployment pipeline which resources to provision.

See the AppKit plugins reference for configuration options for each plugin, or Creating custom plugins to add your own.

Discover plugins

List available plugins and their required resource fields:

databricks apps manifest
Options
| Option | Required | Description |
| --- | --- | --- |
| --template | no | Template path (local directory or GitHub URL). Default: AppKit template |
| --branch | no | Git branch or tag (mutually exclusive with --version) |
| --version | no | AppKit version for default template (default: main) |
| --debug | no | Enable debug logging |
| -o json | no | Output as JSON (default: text) |
| --target | no | Bundle target to use (if applicable) |
| --var | no | Set values for bundle config variables (for example, --var="key=value") |
| --profile | no | Databricks CLI profile name |

Scaffold options

Use databricks apps init to scaffold a new AppKit project. The Apps Quickstart shows the fast path. Use these options for non-interactive or advanced scaffolding.

databricks apps init --name my-app
Options
| Option | Required | Description |
| --- | --- | --- |
| --name | no | App name (lowercase, hyphenated, 26 chars max). Suppresses prompts and applies defaults for other flags |
| --features | no | Comma-separated plugins to enable (for example, lakebase, analytics, genie) |
| --set | no | Resource values: plugin.resourceKey.field=value. Multi-field resources require all fields together |
| --description | no | App description |
| --output-dir | no | Directory to write the project to |
| --deploy | no | Deploy the app after creation |
| --run | no | Run after creation: none, dev, or dev-remote |
| --template | no | Template path (local directory or GitHub URL) |
| --branch | no | Git branch or tag (for GitHub templates, mutually exclusive with --version) |
| --version | no | AppKit version to use (default: latest release; latest for main branch) |
| --debug | no | Enable debug logging |
| -o json | no | Output as JSON (default: text) |
| --target | no | Bundle target to use (if applicable) |
| --var | no | Set values for bundle config variables (for example, --var="key=value") |
| --profile | no | Databricks CLI profile name |

Passing --name suppresses prompts and uses defaults for unspecified options. App names must be lowercase, hyphenated, and 26 characters or fewer. Run databricks apps manifest to see available plugins and their --set keys.

Environment configuration

Local (npm run dev): variables from .env in the project root.

Deployed: variables from app.yaml env entries. Use value for plain strings and valueFrom for resource bindings:

env:
  - name: LAKEBASE_ENDPOINT
    valueFrom: postgres
  - name: WAREHOUSE_ID
    valueFrom: sql-warehouse
  - name: APP_LOG_LEVEL
    value: info

Resources referenced by valueFrom must be declared in databricks.yml. See App configuration for the full resource list.
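
As a sketch of what such a declaration might look like in databricks.yml (the resource name, variable reference, and permission value here are illustrative assumptions; see App configuration for the authoritative schema):

```yaml
# Illustrative sketch only -- field names and values are assumptions;
# check App configuration for the exact schema your bundle needs.
resources:
  apps:
    my-app:
      name: my-app
      resources:
        - name: sql-warehouse        # matches valueFrom: sql-warehouse in app.yaml
          sql_warehouse:
            id: ${var.warehouse_id}  # hypothetical bundle variable
            permission: CAN_USE
```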

Pre-deploy checklist

Before deploying to production:

  • App binds to 0.0.0.0 on DATABRICKS_APP_PORT
  • app.yaml command uses array syntax (no shell strings)
  • No files larger than 10 MB in the project
  • Secrets use valueFrom (never value)
  • databricks.yml declares all required resources
  • databricks apps validate succeeds (--skip-tests skips tests for a faster run)
  • npm run build succeeds locally
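
The first checklist item can be sketched in a few lines of server code. This is a minimal node:http illustration of the contract, not AppKit's implementation (AppKit's server() plugin handles binding for you):

```typescript
import { createServer } from "node:http";

// Databricks Apps injects DATABRICKS_APP_PORT; fall back to 8000 for local runs.
const port = Number(process.env.DATABRICKS_APP_PORT ?? 8000);

const server = createServer((_req, res) => {
  res.end("ok");
});

// Bind to 0.0.0.0 (all interfaces), not localhost,
// so the Apps proxy can reach the container.
server.listen(port, "0.0.0.0");
```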

Validate

Run validation from your app project directory before deploying:

databricks apps validate --profile $DATABRICKS_PROFILE

Validation runs a build, typecheck, and lint. Pass --skip-tests for a faster run.

Deploy

databricks apps deploy
Options
| Option | Required | Description |
| --- | --- | --- |
| APP_NAME | no | App name. Omit when running from a project directory (auto-detected from databricks.yml) |
| --auto-approve | no | Skip interactive approvals that might be required for deployment |
| --skip-validation | no | Skip project validation (build, typecheck, lint) |
| --skip-tests | no | Skip running tests during validation (default: true) |
| --force | no | Force-override Git branch validation |
| --no-wait | no | Return immediately instead of waiting for SUCCEEDED state |
| --timeout | no | Max time to wait for completion (default: 20m) |
| --mode | no | Source code mode: AUTO_SYNC or SNAPSHOT |
| --deployment-id | no | Unique deployment identifier |
| --source-code-path | no | Workspace file system path for source code |
| --json | no | Inline JSON or @path/to/file.json with request body |
| --debug | no | Enable debug logging |
| -o json | no | Output as JSON (default: text) |
| --target | no | Bundle target to use (if applicable) |
| --var | no | Set values for bundle config variables (for example, --var="key=value") |
| --profile | no | Databricks CLI profile name |

The CLI validates configuration, builds the project, uploads it, and starts the app. By default it runs the same project validation as databricks apps validate (build, typecheck, lint). Pass --skip-validation to skip that step. No --source-code-path is needed when deploying from a scaffolded AppKit project.

Verify the deployment

Check that the app deployed successfully:

databricks apps get my-app -o json
Options
| Option | Required | Description |
| --- | --- | --- |
| NAME | yes | App name |
| -o json | no | Output as JSON |
| --debug | no | Enable debug logging |
| --target | no | Bundle target to use (if applicable) |
| --var | no | Set values for bundle config variables (for example, --var="key=value") |
| --profile | no | Databricks CLI profile name |
Example output
{
  "name": "my-app",
  "url": "https://my-app-1234567890.us-west-2.databricksapps.com",
  "description": "A Databricks App powered by AppKit",
  "compute_size": "MEDIUM",
  "app_status": {
    "message": "App has status: App is running",
    "state": "RUNNING"
  },
  "compute_status": {
    "message": "App compute is running.",
    "state": "ACTIVE"
  },
  "active_deployment": {
    "deployment_id": "a1b2c3d4e5f6",
    "source_code_path": "/Workspace/Users/[email protected]/.bundle/my-app/default/files",
    "status": {
      "message": "App started successfully",
      "state": "SUCCEEDED"
    }
  },
  "resources": [
    {
      "name": "postgres",
      "postgres": {
        "branch": "projects/my-project/branches/production",
        "database": "projects/my-project/branches/production/databases/db-abc123",
        "permission": "CAN_CONNECT_AND_CREATE"
      }
    }
  ],
  "service_principal_client_id": "a1b2c3d4-e5f6-7890-abcd-ef1234567890"
}
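
In CI you may want to gate on that output programmatically. A minimal TypeScript sketch that parses the JSON from databricks apps get -o json and fails unless the app is RUNNING (the type shapes follow the example output above; in a real pipeline the raw string would come from the CLI's stdout):

```typescript
// Shape of the fields we care about from `databricks apps get -o json`.
type AppStatus = { state: string; message: string };
type AppInfo = { name: string; url: string; app_status: AppStatus };

// Throw unless the app reports RUNNING; return its URL for later steps.
function assertRunning(info: AppInfo): string {
  if (info.app_status.state !== "RUNNING") {
    throw new Error(`${info.name} is not running: ${info.app_status.message}`);
  }
  return info.url;
}

// Hard-coded sample for illustration; in CI, read this from the CLI's stdout.
const raw = `{
  "name": "my-app",
  "url": "https://my-app-1234567890.us-west-2.databricksapps.com",
  "app_status": { "state": "RUNNING", "message": "App has status: App is running" }
}`;
const url = assertRunning(JSON.parse(raw) as AppInfo);
```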

View logs:

databricks apps logs my-app
Options
| Option | Required | Description |
| --- | --- | --- |
| NAME | no | App name. Omit from project directory (auto-detected) |
| -f / --follow | no | Stream logs until interrupted |
| --tail-lines | no | Recent log lines to show before streaming (default: 200; 0 for all) |
| --timeout | no | Max streaming time with --follow (0 disables) |
| --search | no | Search term to filter logs |
| --source | no | Filter by source: APP, SYSTEM, or both |
| --output-file | no | File path to write logs (in addition to stdout) |
| --debug | no | Enable debug logging |
| -o json | no | Output as JSON (default: text) |
| --target | no | Bundle target to use (if applicable) |
| --var | no | Set values for bundle config variables (for example, --var="key=value") |
| --profile | no | Databricks CLI profile name |
Example log output
[SYSTEM] [INFO] Starting Databricks Apps runtime...
[SYSTEM] [INFO] Starting deployment a1b2c3d4e5f6...
[SYSTEM] [INFO] Downloading source code from /Workspace/Users/.../src/a1b2c3d4e5f6
[SYSTEM] [INFO] Installing dependencies...
[BUILD] added 899 packages, and audited 900 packages in 21s
[SYSTEM] [INFO] Dependencies installed successfully.
[SYSTEM] [INFO] Running build script npm run build:server && npm run build:client
[BUILD] ✔ Build complete in 30ms
[BUILD] ✓ built in 2.80s
[SYSTEM] [INFO] Build completed successfully.
[SYSTEM] [INFO] Starting app with command: [npm run start]
[APP] [appkit:lakebase] Lakebase pool initialized
[APP] [appkit:server] Server running on http://0.0.0.0:8000
[APP] [appkit:server] Mode: production (static)

Managing apps

databricks apps stop my-app
databricks apps start my-app
databricks apps delete my-app
Options
| Option | Required | Description |
| --- | --- | --- |
| NAME | no | App name. Omit from project directory (auto-detected) |
| --no-wait | no | Return immediately (stop/start only) |
| --timeout | no | Max time to wait for completion (default: 20m; stop/start only) |
| --auto-approve | no | Skip confirmation prompts (delete only) |
| --force-lock | no | Force acquisition of deployment lock (delete only) |
| --debug | no | Enable debug logging |
| -o json | no | Output as JSON (default: text) |
| --target | no | Bundle target to use (if applicable) |
| --var | no | Set values for bundle config variables (for example, --var="key=value") |
| --profile | no | Databricks CLI profile name |

apps delete prompts for confirmation. Pass --auto-approve in CI to skip the prompt.

CI/CD

For automated deploys in CI, set DATABRICKS_HOST and DATABRICKS_TOKEN (or use OAuth with DATABRICKS_CLIENT_ID and DATABRICKS_CLIENT_SECRET):

DATABRICKS_HOST=https://<workspace>.cloud.databricks.com \
DATABRICKS_TOKEN=dapi... \
databricks apps deploy

Or use a pre-configured profile:

databricks apps deploy --profile ci-profile

See the Databricks CLI authentication docs for all auth methods.
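
As one possible shape for this in GitHub Actions (the secret names are assumptions, and the databricks/setup-cli step is one way to install the CLI; adapt to your CI system):

```yaml
# Illustrative GitHub Actions job -- secret names are assumptions.
deploy:
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v4
    - uses: databricks/setup-cli@main
    - run: databricks apps deploy --auto-approve
      env:
        DATABRICKS_HOST: ${{ secrets.DATABRICKS_HOST }}
        DATABRICKS_TOKEN: ${{ secrets.DATABRICKS_TOKEN }}
```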

Troubleshooting

For additional troubleshooting, see Deploy apps and the AppKit remote bridge for local connection issues.

  • App fails to deploy: Check logs for error messages, validate app.yaml syntax, and verify that secrets and environment variables in the env section resolve properly. Confirm all dependencies are included or installed.
  • 401 errors (authentication): Verify your token is valid (databricks auth token --profile <PROFILE>), hasn't expired, and includes the required OAuth scopes. Your token's scopes must be a superset of the scopes configured for the app's user authorization.
  • 403 errors (permission denied): Verify you have CAN USE permission on the app. Insufficient OAuth scopes can also cause 403s even with valid permissions.
  • 404 errors (app not found): Verify the app name and workspace URL are correct, the app is deployed and running, and the endpoint path exists.
  • Git deployment fails: For private repositories, verify the app's service principal has a Git credential configured. If deploying through CLI/API/DABs, create the app first, then add the Git credential.

AppKit docs

Access the AppKit API reference, component docs, and plugin docs from the terminal:

npx @databricks/appkit docs                       # browse the documentation index
npx @databricks/appkit docs --full                # full index with all API entries
npx @databricks/appkit docs "<query-or-doc-path>" # view a specific section or file

Run without arguments to browse the index. This is especially useful when building with an AI coding assistant: point the assistant here instead of letting it guess API shapes, or view the AppKit reference on this site.

Where to next

Browse the templates catalog to start building, or add capabilities to your app: Lakebase Postgres for persistent storage or Agent Bricks for AI features.