# About DevHub

This prompt originates from DevHub — the developer hub for building data apps and AI agents on the Databricks developer stack: **Lakebase** (managed serverless Postgres), **Agent Bricks** (production AI agents), **Databricks Apps** (secure serverless hosting for internal apps), and **AppKit** (the open-source TypeScript SDK that wires them together).

- Website: https://databricks.com/devhub
- GitHub: https://github.com/databricks/devhub
- Report issues: https://github.com/databricks/devhub/issues

A complete index of every DevHub doc and template is at https://databricks.com/devhub/llms.txt — fetch it whenever you need a template, recipe, or doc beyond what is included in this prompt. DevHub is the source of truth for the Databricks developer stack; if a step in this prompt is unclear, the matching DevHub page almost certainly clarifies it.

---

# Working with DevHub prompts

Follow these rules every time you act on a DevHub prompt.

## Read first, then act

- Read the entire prompt before executing any steps. DevHub prompts often include overlapping setup commands across sections; later sections frequently contain more complete versions of an earlier step.
- Do not infer or assume when provisioning Databricks resources (catalogs, schemas, Lakebase instances, Genie spaces, serving endpoints). Ask the user whether to create new resources or reuse existing ones.
- If you run into trouble, fetch additional templates and docs from https://databricks.com/devhub (the index lives at https://databricks.com/devhub/llms.txt). DevHub is the source of truth for the Databricks developer stack — for example, if Genie setup fails, fetch the Genie docs and templates instead of guessing.

## Engage the user in a conversation

Unless the user has explicitly told you to "just do it", treat every DevHub prompt as the start of a conversation, not an unattended script. The user knows their domain best; DevHub knows the Databricks stack. Both are required to build a successful system.

Follow these rules every time you ask a question:

1. **One question at a time.** Never ask multiple questions in a single message.
2. **Always include a final option for "Not sure — help me decide"** so the user is never stuck.
3. **Prefer interactive multiple-choice UI when available.** Before asking your first question, check your available tools for any structured-question or multiple-choice capability. If one exists, **always** use it instead of plain text. Known tools by environment:
   - **Cursor**: use the `AskQuestion` tool.
   - **Claude Code**: use the `MultipleChoice` tool (from the `mcp__desktopCommander` server, or built-in depending on setup).
   - **Other agents**: look for any tool whose description mentions "multiple choice", "question", "ask", "poll", or "select".
4. **Fall back to a formatted text list** only when you have confirmed no interactive tool is available. Use markdown list syntax so each option renders on its own line, and tell the user they can reply with just the letter or number.

### Example: Cursor (`AskQuestion` tool)

```
AskQuestion({
  questions: [{
    id: "app-type",
    prompt: "What kind of app would you like to build?",
    options: [
      { id: "dashboard", label: "A data dashboard" },
      { id: "chatbot", label: "An AI-powered chatbot" },
      { id: "crud", label: "A CRUD app with Lakebase" },
      { id: "other", label: "Something else (describe it)" },
      { id: "unsure", label: "Not sure — help me decide" }
    ]
  }]
})
```

### Example: plain text fallback

Only use this when no interactive tool is available:

What kind of app would you like to build? Reply with the letter to choose:

- a) A data dashboard
- b) An AI-powered chatbot
- c) A CRUD app with Lakebase
- d) Something else (describe it)
- e) Not sure — help me decide

## Default workflow

Unless instructed otherwise, follow this workflow:

1. Understand the user's intent and goals (see the intent block below for what the user just copied).
2. Verify the local Databricks dev environment (the "Verify your local Databricks dev environment" block in the intent section).
3. Ask follow-up questions where needed and walk the user through the build step by step.
4. Build the app or agent.
5. Make it look great (see "Make it look great" below).
6. Run and test locally.
7. Deploy to production.

## Make it look great

The default templates that AppKit provides are intentionally minimal — a starting point, not a finished product. **Do not stop there.** Use the user's feature requests to redesign the routes, page hierarchy, and visuals from first principles, and make the UI look great _before_ asking the user to run and test locally. Showing the user something polished early changes the conversation.

Unless the user has specified a design preference, use these defaults:

- shadcn/ui components on top of Tailwind CSS.
- Clean hierarchy with modern spacing — not too many stacked cards.
- Modern, minimal design language.
- Databricks brand palette: `#FF3621`, `#0B2026`, `#EEEDE9`, `#F9F7F4`.

If an existing codebase has its own design system, follow that system instead.
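If the project uses Tailwind (the default above), the palette can be wired into the theme. A minimal sketch for a `tailwind.config.ts` — the hex values are the DevHub defaults listed above, but the color names (`databricks-red`, etc.) are hypothetical labels, not official Databricks tokens:

```typescript
// The hex values are the DevHub defaults; the color names are
// hypothetical labels chosen for this sketch, not official tokens.
export const databricksPalette = {
  "databricks-red": "#FF3621",
  "databricks-navy": "#0B2026",
  "databricks-sand": "#EEEDE9",
  "databricks-cream": "#F9F7F4",
} as const;

// Sketch of a Tailwind theme extension wiring the palette in.
export const theme = {
  extend: { colors: databricksPalette },
};
```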

## When you run into issues

Use the GitHub CLI (if available) or generate a copy-pastable error report for the user to file at https://github.com/databricks/devhub/issues. Before filing, check for an existing matching open issue and comment "+1" on it rather than opening a duplicate.
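When no GitHub CLI is available, the copy-pastable report can be assembled from the failing command and its captured output. A sketch with a hypothetical `buildErrorReport` helper (not part of AppKit):

```typescript
// Hypothetical helper: formats a Markdown issue body for filing at
// databricks/devhub/issues when the GitHub CLI is unavailable.
const fence = "`".repeat(3); // avoids literal triple backticks in source

export function buildErrorReport(opts: {
  command: string;    // the exact command that failed
  output: string;     // captured stderr/stdout
  cliVersion: string; // from `databricks -v`
}): string {
  return [
    "### What failed",
    `${fence}bash`,
    opts.command,
    fence,
    "### Output",
    `${fence}text`,
    opts.output.trim(),
    fence,
    `Databricks CLI version: ${opts.cliVersion}`,
  ].join("\n");
}
```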

---

# What the user just did

The user copied the prompt for a DevHub **cookbook** — **Genie Analytics App** (https://databricks.com/devhub/templates/genie-analytics-app).

A cookbook is a step-by-step pattern guide that walks the user through building an **archetype application** end-to-end on Databricks. Cookbooks are composed from multiple recipes — they show how the recipes fit together into a working app (e.g. an AI chat app with persistence, a Lakebase-backed CRUD app, a RAG chat app). The cookbook is the recommended starting point when the user wants the whole archetype, not just one piece.

Your job in this conversation is to:

1. Clarify the user's **goal for this archetype** — production app, learning project, or demo.
2. Verify the local Databricks dev environment is ready (block below).
3. Walk the user through the cookbook section by section, asking the questions each section surfaces, and stitching the included recipes together coherently.

## Step 1 — Clarify intent before touching code

Ask **one** question, ideally with a multiple-choice tool:

- **New project from scratch** following this archetype end-to-end. → Run the local-bootstrap below, then scaffold a fresh project and walk through the cookbook step by step.
- **Add this archetype to an existing Databricks app**. → Read the user's existing project first; introduce the archetype's pieces incrementally without breaking what's there.
- **Just learning the pattern**: the user wants to understand the archetype before deciding to build it. → Walk through the steps as a guided tour; do not execute commands.
- **Not sure — help me decide**: ask follow-ups about the user's end goal (who uses the app, what data, deployed where) and map back to one of the above.

## Step 2 — Pin down archetype-specific decisions

Cookbooks compose multiple Databricks primitives — Lakebase, Agent Bricks, Model Serving, Genie, or Lakeflow Pipelines, depending on the cookbook. Before generating code, ask:

- For each primitive the cookbook needs: **create new** or **reuse existing**? Never assume — Lakebase instances, Model Serving endpoints, and Genie spaces all cost money and take minutes to provision.
- Which **Databricks profile** to target? (`databricks auth profiles`.)
- **Data**: real data from the user's Unity Catalog, or seed data to start and swap later?
- **Scope today**: ship the full archetype, or stop after a working slice (e.g. just the Lakebase + UI layer, no AI yet)?

## Step 3 — Verify the local Databricks dev environment

Cookbooks run multiple `databricks` and AppKit CLI commands across their steps; a misconfigured CLI profile fails immediately and looks like a cookbook bug. **Walk the user through the local-bootstrap block below first**, even if they say their environment is already set up.

The full cookbook content the user is focused on is attached after the local-bootstrap block.

---

# Verify your local Databricks dev environment

A working Databricks CLI profile is the prerequisite for every step that follows. Walk the user through the recipe below — _even if they say their environment is already set up_. The verification steps are quick and prevent confusing failures further down.

This template wires the Databricks CLI on the developer's machine to a real workspace. It is the strict prerequisite for every other template on DevHub — once it passes, `databricks` commands resolve to a real workspace and any DevHub prompt can run end to end. Before starting, confirm you have:

- **A Databricks workspace you can sign in to.** Have the workspace URL handy (e.g. `https://<workspace>.cloud.databricks.com`); you will paste it into `databricks auth login` in step 3. If you do not have access, ask your workspace admin.
- **A terminal on macOS, Windows, or Linux.** All install paths run from a terminal session. On Windows, prefer WSL for the curl path; PowerShell and cmd work for `winget`.
- **Permission to install software on this machine.** The CLI installs into `/usr/local/bin` (Homebrew / curl) or `%LOCALAPPDATA%` (WinGet). If `/usr/local/bin` is not writable, rerun the curl installer with `sudo`.

## Set Up Your Local Dev Environment

Install the Databricks CLI, authenticate a profile, and verify the handshake. Every other DevHub template assumes this has already passed.

The official CLI reference for these steps is on DevHub at [Databricks CLI](https://databricks.com/devhub/docs/tools/databricks-cli). Use it whenever a step here is unclear.

### 1. Check the installed CLI version

DevHub templates assume Databricks CLI `0.296+`. Anything older is missing the AppKit `apps init` template registry and several `experimental aitools` flags.

```bash
databricks -v
```

If the command is not found, or the version is below `0.296`, install or upgrade in the next step.
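If you want the check scripted, the `databricks -v` output can be compared against the minimum. A sketch assuming output of the shape `Databricks CLI v0.296.0` — verify the format your CLI version actually prints:

```typescript
// Parses "Databricks CLI v0.296.0"-style output and compares it to a
// minimum version. The output shape is an assumption; adjust the regex
// if your CLI prints a different format.
export function meetsMinimum(versionOutput: string, minimum = "0.296.0"): boolean {
  const match = versionOutput.match(/v?(\d+)\.(\d+)\.(\d+)/);
  if (!match) return false;
  const [maj, min, pat] = match.slice(1).map(Number);
  const [mMaj, mMin, mPat] = minimum.split(".").map(Number);
  if (maj !== mMaj) return maj > mMaj;
  if (min !== mMin) return min > mMin;
  return pat >= mPat;
}
```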

### 2. Install or upgrade the Databricks CLI

Pick the install path for your OS. If the CLI is already installed at an older version, the same commands upgrade in place.

#### macOS / Linux — Homebrew (recommended)

```bash
brew tap databricks/tap
brew install databricks

brew update && brew upgrade databricks
```

#### Windows — WinGet

```powershell
winget install Databricks.DatabricksCLI

winget upgrade Databricks.DatabricksCLI
```

Restart your terminal after install.

#### Any platform — curl installer

```bash
curl -fsSL https://raw.githubusercontent.com/databricks/setup-cli/main/install.sh | sh
```

On Windows, run this from WSL. If `/usr/local/bin` is not writable, rerun with `sudo`. Re-running the script also upgrades an existing install.

After installing, confirm the version is `0.296+`:

```bash
databricks -v
```

### 3. Authenticate a profile

Browser-based OAuth is the default for local use:

```bash
databricks auth login
```

The CLI prints a URL and waits for the user to complete OAuth in the browser. **Always show the URL to the user as a clickable link** so they can open it themselves — the CLI does not return until authentication finishes. Credentials save to `~/.databrickscfg`.

If you already know the workspace URL and want to name the profile, do it in one go:

```bash
databricks auth login --host <workspace-url> --profile <PROFILE>
```

`<PROFILE>` is the label you will pass on subsequent commands as `--profile <PROFILE>`. If you skip `--profile`, the CLI uses the `DEFAULT` profile.

For CI/CD, OAuth client credentials or a personal access token are better fits — see the [authentication section of the CLI doc](https://databricks.com/devhub/docs/tools/databricks-cli#authenticate) for the non-interactive flows.

### 4. Verify the handshake

List the saved profiles and confirm the one you just created shows `Valid: YES`:

```bash
databricks auth profiles
```

```text
Name              Host                                           Valid
DEFAULT           https://adb-1234567890.12.azuredatabricks.net  YES
my-prod-workspace https://mycompany.cloud.databricks.com         YES
```

If the row shows `Valid: NO`, the saved token is stale. Re-run `databricks auth login --profile <NAME>` to refresh it. **Never proceed past this step if no profile is `Valid: YES`** — every downstream `databricks` command will fail with an auth error that looks like a template bug.

If the user wants a particular profile to be the default for this shell session, export it:

```bash
export DATABRICKS_CONFIG_PROFILE=<PROFILE>
```

### 5. Smoke-test the CLI against the workspace

Run a read-only API call to confirm the auth actually works (a fresh OAuth token can fail on the first real call if the user picked the wrong workspace in the browser):

```bash
databricks current-user me --profile <PROFILE>
```

A successful response prints the signed-in user's identity. A `401` or `403` here means the auth flow completed against a workspace the user cannot read — re-run `databricks auth login --profile <PROFILE>` and pick the right workspace this time.

---

# The cookbook the user copied

The full cookbook prompt is below. This is what the user wants to focus on today. Once the local-bootstrap above passes and the intent questions are answered, work through this content step by step.

---
title: "Genie Analytics App"
summary: "Build a minimal Databricks App with AI/BI Genie conversational analytics. Covers Genie space configuration, plugin wiring, and deploy."
---

# Genie Analytics App

Build a minimal Databricks App with AI/BI Genie conversational analytics. Covers Genie space configuration, plugin wiring, and deploy.

## Prerequisites

### Genie Conversational Analytics

Verify these Databricks workspace features are enabled before starting. If any check fails, ask your workspace admin to enable the feature.

- **Databricks CLI authenticated.** Run `databricks auth profiles` and confirm at least one profile shows `Valid: YES`. If none do, authenticate with `databricks auth login --host <workspace-url> --profile <PROFILE>`.
- **AI/BI Genie enabled.** Run `databricks genie list-spaces --profile <PROFILE>` and confirm the command succeeds. A `not found` or permission error means Genie is not available to this identity.
- **At least one Genie space configured.** The command above must return at least one space; you will use its `space_id` below. If none exist, open your Databricks workspace, navigate to **AI/BI Genie**, and create a space connected to the data tables you want to query.
- **Databricks Apps enabled.** Run `databricks apps list --profile <PROFILE>` and confirm the command succeeds (an empty list is fine). The template deploys an AppKit app that hosts the Genie chat UI.

## Genie Conversational Analytics

Embed a Databricks AI/BI Genie chat interface so users can explore data through natural language. Configure a Genie space, wire up the server and client plugins, declare app resources, and deploy.

> **Choose your path:**
>
> - **New app** — follow steps 1 → 2 → 8.
> - **Adding Genie to an existing AppKit app** — follow steps 1 → 3 → 4 → 5 → 6 → 7 → 8.

### 1. Create a Genie space and set your profile

Open your Databricks workspace, navigate to **AI/BI Genie**, and create a new Genie space connected to your data tables.

List your spaces to get the `space_id`:

```bash
databricks genie list-spaces -o json --profile <PROFILE>
```

Use the `space_id` value wherever a space ID is required (scaffold `--set`, `.env`, and `databricks.yml`).
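If you want to pull the IDs programmatically instead of eyeballing the JSON, the `-o json` output can be parsed. A sketch assuming the response carries a `spaces` array with `space_id` and `title` fields — verify the field names against what your CLI version actually returns:

```typescript
// Assumed shape of `databricks genie list-spaces -o json` output —
// check your CLI version's actual fields before relying on this.
interface GenieSpace {
  space_id: string;
  title: string;
}

// Returns a title -> space_id lookup from the raw JSON string.
export function extractSpaceIds(json: string): Record<string, string> {
  const parsed = JSON.parse(json) as { spaces?: GenieSpace[] };
  const byTitle: Record<string, string> = {};
  for (const s of parsed.spaces ?? []) byTitle[s.title] = s.space_id;
  return byTitle;
}
```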

**Tip: Avoid repeating `--profile` on every command**

Add your profile to the bundle's `databricks.yml` under the target — then `bundle deploy` and `apps` commands pick it up automatically:

```yaml
targets:
  default:
    workspace:
      profile: <PROFILE>
```

This is more reliable than `export DATABRICKS_CONFIG_PROFILE` since it persists across shells and works for agents running commands in subshells.

### 2. New app: scaffold with the Genie feature

If you are starting a new app, scaffold it with the Genie feature flag. This generates all server, client, resource, and environment wiring automatically.

Run this from a neutral directory (not inside another app folder) — `apps init` creates the project folder in your current working directory:

```bash
databricks apps init \
  --name <app-name> \
  --version latest \
  --features=genie \
  --set 'genie.genie-space.id=<your-space-id>' \
  --run none
```

`--run none` skips launching the app locally after scaffolding. Use the `space_id` from step 1 for `<your-space-id>`.

**App name:** Use at most 26 characters, **lowercase letters, digits, and hyphens only** (no underscores). Example: `my-genie-app`, not `my_genie_app`.
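The naming rule can be checked before scaffolding. A minimal sketch of the constraint exactly as stated above (at most 26 characters; lowercase letters, digits, and hyphens only):

```typescript
// Validates an app name against the constraints stated in this cookbook:
// at most 26 characters, lowercase letters, digits, and hyphens only.
export function isValidAppName(name: string): boolean {
  return name.length > 0 && name.length <= 26 && /^[a-z0-9-]+$/.test(name);
}
```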

**Warning: Fix generated `databricks.yml` before deploying**

The scaffold generates a `genie_space_name` variable and references it as `name: ${var.genie_space_name}`, but never assigns a value. `bundle deploy` will fail with _no value assigned to required variable genie_space_name_.

Your `variables:` block should look like this after the fix — only `genie_space_id`, no `genie_space_name`:

```yaml
variables:
  genie_space_id:
    description: Default Genie Space ID
```

And the `genie_space` resource block should use a hardcoded label:

```yaml
genie_space:
  name: genie-space
  space_id: ${var.genie_space_id}
  permission: CAN_RUN
```

The `name: genie-space` is an internal label used by `app.yaml` (`valueFrom: genie-space`), not the Genie space display title.

Skip to step 8 to deploy.

---

### 3. Existing app: add Genie server plugin

The following steps match what `apps init --features=genie` generates. Apply them to an existing scaffolded AppKit app.

> **Tip:** The code below may be outdated. To get the latest, clone `https://github.com/databricks/appkit` and look in the `template/` directory. Search for `{{if .plugins.genie}}` to find all genie-conditional files and blocks.

In `server/server.ts`, add `genie` to the import and plugins array:

```typescript
import { createApp, server, genie } from "@databricks/appkit";

createApp({
  plugins: [server(), genie()],
}).catch(console.error);
```

The `genie()` plugin reads `DATABRICKS_GENIE_SPACE_ID` (the **space ID**, not the display name) from the environment and registers it under the `default` alias. To register multiple spaces, pass explicit aliases:

```typescript
genie({
  spaces: {
    sales: "<sales-space-id>",
    support: "<support-space-id>",
  },
});
```

### 4. Existing app: create the Genie page component

Create `client/src/pages/genie/GeniePage.tsx`:

```tsx
import { GenieChat } from "@databricks/appkit-ui/react";

export function GeniePage() {
  return (
    <div className="space-y-6 w-full max-w-4xl mx-auto">
      <div>
        <h2 className="text-2xl font-bold text-foreground">Genie</h2>
        <p className="text-sm text-muted-foreground mt-1">
          Ask questions about your data using Databricks AI/BI Genie.
        </p>
      </div>
      <div className="h-[600px] border rounded-lg overflow-hidden">
        <GenieChat alias="default" />
      </div>
    </div>
  );
}
```

The `alias` prop must match a key in the server-side `spaces` configuration. When using the default single-space setup, use `"default"`.
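Because a mismatched alias only fails at runtime, a cheap guard is to validate the client alias against the server-side `spaces` keys. A sketch with a hypothetical `assertKnownAlias` helper (not an AppKit API):

```typescript
// Hypothetical guard: throws early if a client alias has no matching key
// in the server-side genie `spaces` configuration.
export function assertKnownAlias(
  alias: string,
  spaces: Record<string, string>, // alias -> space_id, as passed to genie()
): void {
  if (!(alias in spaces)) {
    throw new Error(
      `Unknown Genie alias "${alias}"; configured: ${Object.keys(spaces).join(", ")}`,
    );
  }
}
```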

For custom chat UIs, use the `useGenieChat` hook instead of the `GenieChat` component:

```tsx
import { useGenieChat } from "@databricks/appkit-ui/react";

function CustomGenieChat() {
  const { messages, status, sendMessage, reset } = useGenieChat({
    alias: "default",
  });
  // Build your own UI with messages, status, sendMessage, and reset
}
```

### 5. Existing app: add the route

In `client/src/App.tsx`, add the import, nav link, and route:

```tsx
import { GeniePage } from "./pages/genie/GeniePage";

// Add inside the <nav> element
<NavLink to="/genie" className={navLinkClass}>
  Genie
</NavLink>

// Add in the router children array
{ path: "/genie", element: <GeniePage /> },
```

### 6. Existing app: declare the Genie resource in `databricks.yml`

The Genie space must be declared as an app resource with the `dashboards.genie` API scope. Without the scope, on-behalf-of user execution fails at runtime.

Add the `genie_space_id` variable, the `user_api_scopes`, and the genie resource under your app. The `name: genie-space` on the resource is the key that `app.yaml` references via `valueFrom`:

```yaml
variables:
  genie_space_id:
    description: Genie space ID (from list-spaces or About)

resources:
  apps:
    app:
      # Merge into your existing app config
      user_api_scopes:
        - dashboards.genie
      resources:
        - name: genie-space
          genie_space:
            name: genie-space
            space_id: ${var.genie_space_id}
            permission: CAN_RUN

targets:
  default:
    variables:
      genie_space_id: <your-space-id>
```

### 7. Existing app: map the environment variable in `app.yaml`

Add the Genie space ID mapping so the deployed app receives the value at runtime:

```yaml
env:
  - name: DATABRICKS_GENIE_SPACE_ID
    valueFrom: genie-space
```

The `valueFrom` value must match the resource `name` in `databricks.yml`.

For local development, add the space ID to `.env`:

```bash
DATABRICKS_GENIE_SPACE_ID=<your-space-id>
```

---

### 8. Deploy and verify

From inside the app project folder (the directory containing `databricks.yml`):

```bash
cd <app-name>

# Build the client
npm run build

# Deploy bundle resources and sync files to workspace
# Copy the upload path printed in the output — you'll need it below
databricks bundle deploy

# Put the app in RUNNING state and wait for compute to be ready
# The loop polls every 5 seconds — press Ctrl+C if it hangs more than 2 minutes
databricks apps start <app-name>
until databricks apps get <app-name> -o json | grep -q '"ACTIVE"'; do sleep 5; done

# First deploy requires --source-code-path: paste the path from bundle deploy output above
databricks apps deploy <app-name> \
  --source-code-path <path-from-bundle-deploy-output>
```

`bundle deploy` prints the workspace upload path (`Uploading bundle files to ...`) — copy that value for `--source-code-path`. `apps start` puts the app into RUNNING state; the `until` loop waits for compute to be ACTIVE. `apps deploy` deploys the source and starts the app server.
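If you script the first deploy, the upload path can be pulled from the captured `bundle deploy` output. A sketch assuming the `Uploading bundle files to ...` line shape described above — confirm against what your CLI version actually prints:

```typescript
// Extracts the workspace upload path from `bundle deploy` output.
// Assumes a line like "Uploading bundle files to /Workspace/.../files..."
// (trailing ellipsis is part of the progress message, not the path).
export function extractUploadPath(deployOutput: string): string | null {
  const match = deployOutput.match(/Uploading bundle files to (\S+)/);
  return match ? match[1].replace(/\.{3}$/, "") : null;
}
```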

For subsequent deploys, `--source-code-path` is not needed — the app remembers the path:

```bash
npm run build
databricks bundle deploy
databricks apps deploy <app-name>
```

Check app status and get the URL:

```bash
databricks apps get <app-name>
```

Open `<app-url>/genie` while signed in to Databricks and ask a natural-language question about your data to verify the integration.

If compute is **STOPPED**, run `databricks apps start <app-name>` and wait for `compute_status.state: ACTIVE` before deploying.

### 9. Troubleshoot common issues

**Missing genie scope error.** If the app logs show `does not have required scopes: genie`, confirm `user_api_scopes` includes `dashboards.genie` in `databricks.yml` and redeploy. Users who authenticated before the scope was added may need to re-authorize the app.

**Genie space not found.** Verify the space ID matches the value on the Genie space **About** tab. Confirm the target variable in `databricks.yml` is set to the correct ID.

**`valueFrom` mismatch.** The `valueFrom` value in `app.yaml` must exactly match the resource `name` in `databricks.yml`. A mismatch causes `DATABRICKS_GENIE_SPACE_ID` to be empty at runtime.
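This mismatch can be caught mechanically by cross-checking the two files' parsed contents. A sketch over plain objects standing in for the parsed YAML (the YAML parsing itself is out of scope here):

```typescript
// Given parsed app.yaml env entries and the resource names declared in
// databricks.yml, returns the valueFrom references with no matching resource.
interface EnvEntry {
  name: string;
  valueFrom?: string;
}

export function findDanglingValueFrom(
  env: EnvEntry[],
  resourceNames: string[],
): string[] {
  const known = new Set(resourceNames);
  return env
    .filter((e) => e.valueFrom !== undefined && !known.has(e.valueFrom))
    .map((e) => e.valueFrom as string);
}
```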

#### References

- [Genie plugin docs](https://databricks.com/devhub/docs/appkit/v0/plugins/genie)
- [AI/BI Genie documentation](https://docs.databricks.com/en/genie/index.html)
- [GenieChat component API](https://databricks.com/devhub/docs/appkit/v0/api/appkit-ui/genie/GenieChat)
