# About DevHub

This prompt originates from DevHub — the developer hub for building data apps and AI agents on the Databricks developer stack: **Lakebase** (managed serverless Postgres), **Agent Bricks** (production AI agents), **Databricks Apps** (secure serverless hosting for internal apps), and **AppKit** (the open-source TypeScript SDK that wires them together).

- Website: https://databricks.com/devhub
- GitHub: https://github.com/databricks/devhub
- Report issues: https://github.com/databricks/devhub/issues

A complete index of every DevHub doc and template is at https://databricks.com/devhub/llms.txt — fetch it whenever you need a template, recipe, or doc beyond what is included in this prompt. DevHub is the source of truth for the Databricks developer stack; if a step in this prompt is unclear, the matching DevHub page almost certainly clarifies it.

---

# Working with DevHub prompts

Follow these rules every time you act on a DevHub prompt.

## Read first, then act

- Read the entire prompt before executing any steps. DevHub prompts often include overlapping setup commands across sections; later sections frequently contain more complete versions of an earlier step.
- Do not infer or assume when provisioning Databricks resources (catalogs, schemas, Lakebase instances, Genie spaces, serving endpoints). Ask the user whether to create new resources or reuse existing ones.
- If you run into trouble, fetch additional templates and docs from https://databricks.com/devhub (the index lives at https://databricks.com/devhub/llms.txt). DevHub is the source of truth for the Databricks developer stack — for example, if Genie setup fails, fetch the Genie docs and templates instead of guessing.

## Engage the user in a conversation

Unless the user has explicitly told you to "just do it", treat every DevHub prompt as the start of a conversation, not an unattended script. The user knows their domain best; DevHub knows the Databricks stack. Both are required to build a successful system.

Follow these rules every time you ask a question:

1. **One question at a time.** Never ask multiple questions in a single message.
2. **Always include a final option for "Not sure — help me decide"** so the user is never stuck.
3. **Prefer interactive multiple-choice UI when available.** Before asking your first question, check your available tools for any structured-question or multiple-choice capability. If one exists, **always** use it instead of plain text. Known tools by environment:
   - **Cursor**: use the `AskQuestion` tool.
   - **Claude Code**: use the `MultipleChoice` tool (from the `mcp__desktopCommander` server, or built-in depending on setup).
   - **Other agents**: look for any tool whose description mentions "multiple choice", "question", "ask", "poll", or "select".
4. **Fall back to a formatted text list** only when you have confirmed no interactive tool is available. Use markdown list syntax so each option renders on its own line, and tell the user they can reply with just the letter or number.

### Example: Cursor (`AskQuestion` tool)

```
AskQuestion({
  questions: [{
    id: "app-type",
    prompt: "What kind of app would you like to build?",
    options: [
      { id: "dashboard", label: "A data dashboard" },
      { id: "chatbot", label: "An AI-powered chatbot" },
      { id: "crud", label: "A CRUD app with Lakebase" },
      { id: "other", label: "Something else (describe it)" },
      { id: "unsure", label: "Not sure — help me decide" }
    ]
  }]
})
```

### Example: plain text fallback

Only use this when no interactive tool is available:

What kind of app would you like to build? Reply with the letter to choose:

- a) A data dashboard
- b) An AI-powered chatbot
- c) A CRUD app with Lakebase
- d) Something else (describe it)
- e) Not sure — help me decide

## Default workflow

Unless instructed otherwise, follow this workflow:

1. Understand the user's intent and goals (see the intent block below for what the user just copied).
2. Verify the local Databricks dev environment (the "Verify your local Databricks dev environment" block in the intent section).
3. Ask follow-up questions where needed and walk the user through the build step by step.
4. Build the app or agent.
5. Make it look great (see "Make it look great" below).
6. Run and test locally.
7. Deploy to production.

## Make it look great

The default templates that AppKit provides are intentionally minimal — a starting point, not a finished product. **Do not stop there.** Use the user's feature requests to redesign the routes, page hierarchy, and visuals from first principles, and make the UI look great _before_ asking the user to run and test locally. Showing the user something polished early changes the conversation.

Unless the user has specified a design preference, use these defaults:

- shadcn/ui components on top of Tailwind CSS.
- Clean hierarchy with modern spacing — not too many stacked cards.
- Modern, minimal design language.
- Databricks brand palette: `#FF3621`, `#0B2026`, `#EEEDE9`, `#F9F7F4`.

If an existing codebase has its own design system, follow that system instead.

## When you run into issues

Use the GitHub CLI (if available) or generate a copy-pastable error report for the user to file at https://github.com/databricks/devhub/issues. Before filing, check for an existing matching open issue and comment "+1" on it rather than opening a duplicate.

---

# What the user just did

The user copied the prompt for a DevHub **recipe** — **Volume File Manager** (https://databricks.com/devhub/templates/volume-file-upload).

A recipe is a focused, opinionated how-to for a single Databricks pattern (e.g. wiring Lakebase Change Data Feed, creating a Model Serving endpoint, persisting chat history). Recipes are designed to be dropped into an existing project or composed into a larger build. They are deliberately narrow — they solve one thing well.

Your job in this conversation is to:

1. Clarify whether the user is **integrating this recipe into an existing project** or **starting fresh from scratch**, and adapt accordingly.
2. Verify the local Databricks dev environment is ready (block below).
3. Walk the user through the recipe step by step, asking the questions the recipe itself surfaces.

## Step 1 — Clarify intent before touching code

Ask **one** question, ideally with a multiple-choice tool (see guidelines):

- **Existing project**: the user already has a Databricks app / repo and wants to add this pattern to it. → Read the user's existing project structure first; the recipe steps will be applied surgically.
- **New project from this recipe**: the user wants this recipe as the starting point of a new app. → Run the local-bootstrap below first, then follow the recipe.
- **Just learning**: the user wants to read through the recipe and understand it without building anything yet. → Walk through the steps as a tutorial; do not execute commands.
- **Not sure — help me decide**: ask the user what they're trying to accomplish at the project level, then map back to one of the above.

## Step 2 — Pin down recipe-specific decisions

Once the integration mode is clear, ask any follow-ups the recipe itself surfaces — typically about which Databricks resources to use:

- Should we **create new resources** (catalog, schema, Lakebase instance, serving endpoint) or **reuse existing ones** the user already has? Never assume; always ask.
- Which **Databricks profile** should the CLI commands target? (`databricks auth profiles` to list valid profiles.)
- If the recipe touches data: use the user's data, or use seed/sample data first?

## Step 3 — Verify the local Databricks dev environment

Whether integrating or starting fresh, the recipe's commands assume a working Databricks CLI profile and (for app-related recipes) an AppKit project. **Walk the user through the local-bootstrap block below before running any recipe commands** — even if they think the environment is already set up, the verification steps are quick and prevent confusing failures downstream.

The full recipe content the user is focused on is attached after the local-bootstrap block.

---

# Verify your local Databricks dev environment

A working Databricks CLI profile is the prerequisite for every step that follows. Walk the user through the steps below — _even if they say their environment is already set up_. The verification steps are quick and prevent confusing failures further down.

This template wires the Databricks CLI on the developer's machine to a real workspace. It is the strict prerequisite for every other template on DevHub — once it passes, `databricks` commands resolve against that workspace and any DevHub prompt can run end to end. You will need:

- **A Databricks workspace you can sign in to.** Have the workspace URL handy (e.g. `https://<workspace>.cloud.databricks.com`); you will paste it into `databricks auth login` in step 3. If you do not have access, ask your workspace admin.
- **A terminal on macOS, Windows, or Linux.** All install paths run from a terminal session. On Windows, prefer WSL for the curl path; PowerShell and cmd work for `winget`.
- **Permission to install software on this machine.** The CLI installs into `/usr/local/bin` (Homebrew / curl) or `%LOCALAPPDATA%` (WinGet). If `/usr/local/bin` is not writable, rerun the curl installer with `sudo`.

## Set up your local dev environment

Install the Databricks CLI, authenticate a profile, and verify the handshake. Every other DevHub template assumes this has already passed.

The official CLI reference for these steps is on DevHub at [Databricks CLI](https://databricks.com/devhub/docs/tools/databricks-cli). Use it whenever a step here is unclear.

### 1. Check the installed CLI version

DevHub templates assume Databricks CLI `0.296+`. Anything older is missing the AppKit `apps init` template registry and several `experimental aitools` flags.

```bash
databricks -v
```

If the command is not found, or the version is below `0.296`, install or upgrade in the next step.

### 2. Install or upgrade the Databricks CLI

Pick the install path for your OS. If the CLI is already installed at an older version, the same commands upgrade in place.

#### macOS / Linux — Homebrew (recommended)

```bash
brew tap databricks/tap
brew install databricks

brew update && brew upgrade databricks
```

#### Windows — WinGet

```powershell
winget install Databricks.DatabricksCLI

winget upgrade Databricks.DatabricksCLI
```

Restart your terminal after install.

#### Any platform — curl installer

```bash
curl -fsSL https://raw.githubusercontent.com/databricks/setup-cli/main/install.sh | sh
```

On Windows, run this from WSL. If `/usr/local/bin` is not writable, rerun with `sudo`. Re-running the script also upgrades an existing install.

After installing, confirm the version is `0.296+`:

```bash
databricks -v
```

### 3. Authenticate a profile

Browser-based OAuth is the default for local use:

```bash
databricks auth login
```

The CLI prints a URL and waits for the user to complete OAuth in the browser. **Always show the URL to the user as a clickable link** so they can open it themselves — the CLI does not return until authentication finishes. Credentials save to `~/.databrickscfg`.

If you already know the workspace URL and want to name the profile, do it in one go:

```bash
databricks auth login --host <workspace-url> --profile <PROFILE>
```

`<PROFILE>` is the label you will pass on subsequent commands as `--profile <PROFILE>`. If you skip `--profile`, the CLI uses the `DEFAULT` profile.

For CI/CD, OAuth client credentials or a personal access token are better fits — see the [authentication section of the CLI doc](https://databricks.com/devhub/docs/tools/databricks-cli#authenticate) for the non-interactive flows.

### 4. Verify the handshake

List the saved profiles and confirm the one you just created shows `Valid: YES`:

```bash
databricks auth profiles
```

```text
Name              Host                                           Valid
DEFAULT           https://adb-1234567890.12.azuredatabricks.net  YES
my-prod-workspace https://mycompany.cloud.databricks.com         YES
```

If the row shows `Valid: NO`, the saved token is stale. Re-run `databricks auth login --profile <NAME>` to refresh it. **Never proceed past this step if no profile is `Valid: YES`** — every downstream `databricks` command will fail with an auth error that looks like a template bug.

If the user wants a particular profile to be the default for this shell session, export it:

```bash
export DATABRICKS_CONFIG_PROFILE=<PROFILE>
```

### 5. Smoke-test the CLI against the workspace

Run a read-only API call to confirm the auth actually works (a fresh OAuth token can fail on the first real call if the user picked the wrong workspace in the browser):

```bash
databricks current-user me --profile <PROFILE>
```

A successful response prints the signed-in user's identity. A `401` or `403` here means the auth flow completed against a workspace the user cannot read — re-run `databricks auth login --profile <PROFILE>` and pick the right workspace this time.

---

# The recipe the user copied

The full recipe prompt is below. This is what the user wants to focus on today. Once the local-bootstrap above passes and the intent questions are answered, work through this content step by step.

Verify these Databricks workspace features are enabled before starting. If any check fails, ask your workspace admin to enable the feature.

- **Databricks CLI authenticated.** Run `databricks auth profiles` and confirm at least one profile shows `Valid: YES`. If none do, authenticate with `databricks auth login --host <workspace-url> --profile <PROFILE>`.
- **Unity Catalog enabled with access to a catalog and schema.** Run `databricks catalogs list --profile <PROFILE>` and confirm at least one writable catalog is listed. You also need `USE_CATALOG` on the catalog and `USE_SCHEMA` + `CREATE_VOLUME` on the schema where the template creates the managed Volume. A `PERMISSION_DENIED` error on `databricks volumes create` in Step 1 means one of those grants is missing.
- **Databricks Apps enabled.** Run `databricks apps list --profile <PROFILE>` and confirm the command succeeds (an empty list is fine). The template deploys an AppKit app that reads and writes through the `files` plugin.

## Volume File Manager

Add file upload, browsing, download, delete, file type validation, and CSV row preview to your Databricks app using Unity Catalog Volumes. The `files` plugin registers all file management HTTP routes automatically. No custom server routes needed.

### 1. Create a Unity Catalog Volume

Create a managed Volume to store uploaded files:

```bash
databricks volumes create <catalog> <schema> <volume-name> MANAGED \
  --profile <PROFILE>
```

Note the full Volume path: `/Volumes/<catalog>/<schema>/<volume-name>`.

### 2. New app: scaffold with the Files feature

```bash
databricks apps init \
  --name <app-name> \
  --version latest \
  --features=files \
  --set 'files.files.path=/Volumes/<catalog>/<schema>/<volume-name>' \
  --run none --profile <PROFILE>
```

The CLI maps `files.files.path` to `DATABRICKS_VOLUME_FILES` and configures a volume named `files`. It also scaffolds `client/src/pages/files/FilesPage.tsx` and wires the route in `App.tsx` automatically. No manual page creation needed.

After init, install dependencies:

```bash
cd <app-name>
npm install
```

Skip to step 5.

### 3. Existing app: add Files manually

Apply the following changes to an existing scaffolded AppKit app.

#### Add `files` to server plugins

In `server/server.ts`, add `files` to the import and plugins array:

```typescript
import { createApp, server, files } from "@databricks/appkit";

createApp({
  plugins: [server(), files()],
}).catch(console.error);
```

The `files()` plugin auto-discovers all `DATABRICKS_VOLUME_*` environment variables and registers each as a named volume. The env var suffix (lowercased) becomes the volume key used in all API routes: `DATABRICKS_VOLUME_FILES` → volume key `files`.
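The suffix-to-key mapping described above can be sketched as follows (an illustration of the behavior, not the plugin's actual source):

```typescript
// Illustrative only: how a DATABRICKS_VOLUME_* env var name becomes a
// volume key. The suffix after the prefix is lowercased.
function volumeKeyFromEnvVar(name: string): string | null {
  const prefix = "DATABRICKS_VOLUME_";
  if (!name.startsWith(prefix)) return null;
  return name.slice(prefix.length).toLowerCase();
}

// DATABRICKS_VOLUME_FILES -> "files", so routes mount at /api/files/files/...
```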

To limit upload size, pass a config:

```typescript
files({
  volumes: {
    files: { maxUploadSize: 100_000_000 }, // 100 MB
  },
});
```

#### Add environment variable

Add to `.env` for local development:

```bash
DATABRICKS_VOLUME_FILES=/Volumes/<catalog>/<schema>/<volume-name>
```

#### Update `databricks.yml`

Add the volume variables, resource, and target values. The resource uses `uc_securable`. Note that `securable_full_name` is the Unity Catalog three-part name (`<catalog>.<schema>.<volume-name>`), not the `/Volumes/...` path. `user_api_scopes` is required for on-behalf-of (OBO) token access to work in production.

```yaml
variables:
  files_path:
    description: Volume path for file storage (e.g. /Volumes/catalog/schema/volume_name)
  files_id:
    description: Unity Catalog Volume securable full name (e.g. catalog.schema.volume_name)

resources:
  apps:
    app:
      # Add under existing app config
      user_api_scopes:
        - files.files
      resources:
        - name: files
          uc_securable:
            securable_full_name: ${var.files_id}
            securable_type: VOLUME
            permission: WRITE_VOLUME

targets:
  default:
    variables:
      files_path: /Volumes/<catalog>/<schema>/<volume-name>
      files_id: <catalog>.<schema>.<volume-name>
```
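To keep the two values straight: the three-part name is derivable from the `/Volumes/...` path. A hypothetical helper for illustration (not part of AppKit):

```typescript
// Hypothetical helper: convert /Volumes/<catalog>/<schema>/<volume> into
// the Unity Catalog three-part name used by securable_full_name.
function securableFromVolumePath(path: string): string {
  const parts = path.replace(/^\/Volumes\//, "").split("/").filter(Boolean);
  if (parts.length !== 3) {
    throw new Error(`Not a /Volumes/<catalog>/<schema>/<volume> path: ${path}`);
  }
  return parts.join("."); // e.g. catalog.schema.volume_name
}
```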

#### Update `app.yaml`

Expose the volume path to the running app:

```yaml
command: ["npm", "run", "start"]
env:
  - name: DATABRICKS_VOLUME_FILES
    valueFrom: files
```

### 4. Create the file manager page

The `files` plugin auto-registers HTTP routes at `/api/files/files/...` for the volume key `files`:

| Method   | Path                                    | Description                    |
| -------- | --------------------------------------- | ------------------------------ |
| `GET`    | `/api/files/files/list?path=<dir>`      | List directory entries         |
| `GET`    | `/api/files/files/preview?path=<file>`  | File metadata + text preview   |
| `GET`    | `/api/files/files/download?path=<file>` | Download (attachment)          |
| `GET`    | `/api/files/files/raw?path=<file>`      | Serve inline (safe types only) |
| `POST`   | `/api/files/files/upload?path=<file>`   | Upload raw body                |
| `DELETE` | `/api/files/files?path=<file>`          | Delete file                    |
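A minimal client-side sketch of calling these routes, assuming the volume key is `files` (the helper names here are illustrative, not AppKit APIs):

```typescript
// Build a URL for one of the auto-registered routes. DELETE targets the
// bare volume route; every other action is a sub-path.
function routeUrl(action: string, path: string): string {
  const base =
    action === "delete" ? "/api/files/files" : `/api/files/files/${action}`;
  return `${base}?path=${encodeURIComponent(path)}`;
}

// Upload sends the raw request body (not multipart/form-data), then
// re-lists the directory to confirm the file landed.
async function uploadAndList(body: Blob, name: string, dir: string) {
  const target = dir ? `${dir}/${name}` : name;
  const up = await fetch(routeUrl("upload", target), { method: "POST", body });
  if (!up.ok) throw new Error(`Upload failed (${up.status})`);
  const res = await fetch(routeUrl("list", dir));
  return (await res.json()) as Array<{ name?: string; is_directory?: boolean }>;
}
```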

#### Create `client/src/pages/files/FilesPage.tsx`

File browser with upload, folder creation, download, delete, and file preview. Uses `AbortController` to cancel stale list and preview requests. Entries are sorted directories-first, then alphabetically. `resolveEntryPath` constructs the full path from `currentPath + entry.name`. Do not use `entry.path` directly, as it may not be set by the API.

```tsx
import type { DirectoryEntry, FilePreview } from "@databricks/appkit-ui/react";
import {
  Button,
  DirectoryList,
  FileBreadcrumb,
  FilePreviewPanel,
  NewFolderInput,
} from "@databricks/appkit-ui/react";
import { FolderPlus, Loader2, Upload } from "lucide-react";
import {
  type RefObject,
  useCallback,
  useEffect,
  useRef,
  useState,
} from "react";

function useAbortController(): RefObject<AbortController | null> {
  const ref = useRef<AbortController | null>(null);
  return ref;
}

function nextSignal(ref: RefObject<AbortController | null>): AbortSignal {
  ref.current?.abort();
  ref.current = new AbortController();
  return ref.current.signal;
}

export function FilesPage() {
  const [volumes, setVolumes] = useState<string[]>([]);
  const [volumeKey, setVolumeKey] = useState<string>(
    () => localStorage.getItem("appkit:files:volumeKey") ?? "",
  );
  const [currentPath, setCurrentPath] = useState<string>("");
  const [entries, setEntries] = useState<DirectoryEntry[]>([]);
  const [loading, setLoading] = useState(false);
  const [error, setError] = useState<string | null>(null);
  const [selectedFile, setSelectedFile] = useState<string | null>(null);
  const [preview, setPreview] = useState<FilePreview | null>(null);
  const [previewLoading, setPreviewLoading] = useState(false);
  const [uploading, setUploading] = useState(false);
  const [deleting, setDeleting] = useState(false);
  const [creatingDir, setCreatingDir] = useState(false);
  const [newDirName, setNewDirName] = useState("");
  const [showNewDirInput, setShowNewDirInput] = useState(false);
  const fileInputRef = useRef<HTMLInputElement>(null);
  const listAbort = useAbortController();
  const previewAbort = useAbortController();

  const normalize = (p: string) => p.replace(/\/+$/, "");
  const isAtRoot = !currentPath;

  const apiUrl = useCallback(
    (action: string, params?: Record<string, string>) => {
      const base = `/api/files/${volumeKey}/${action}`;
      if (!params) return base;
      const qs = new URLSearchParams(params).toString();
      return `${base}?${qs}`;
    },
    [volumeKey],
  );

  const loadDirectory = useCallback(
    async (path?: string) => {
      if (!volumeKey) return;
      setLoading(true);
      setError(null);
      setSelectedFile(null);
      setPreview(null);

      try {
        const signal = nextSignal(listAbort);
        const url = path ? apiUrl("list", { path }) : apiUrl("list");
        const response = await fetch(url, { signal });

        if (!response.ok) {
          const data = await response.json().catch(() => ({}));
          throw new Error(
            data.error ?? `HTTP ${response.status}: ${response.statusText}`,
          );
        }

        const data: DirectoryEntry[] = await response.json();
        data.sort((a, b) => {
          if (a.is_directory && !b.is_directory) return -1;
          if (!a.is_directory && b.is_directory) return 1;
          return (a.name ?? "").localeCompare(b.name ?? "");
        });
        setEntries(data);
        setCurrentPath(path ?? "");
      } catch (err) {
        if (err instanceof DOMException && err.name === "AbortError") return;
        setError(err instanceof Error ? err.message : String(err));
        setEntries([]);
      } finally {
        setLoading(false);
      }
    },
    [volumeKey, apiUrl, listAbort],
  );

  const loadPreview = useCallback(
    async (filePath: string) => {
      setPreviewLoading(true);
      setPreview(null);

      try {
        const signal = nextSignal(previewAbort);
        const response = await fetch(apiUrl("preview", { path: filePath }), {
          signal,
        });

        if (!response.ok) {
          const data = await response.json().catch(() => ({}));
          throw new Error(data.error ?? `HTTP ${response.status}`);
        }

        const data = await response.json();
        setPreview(data);
      } catch (err) {
        if (err instanceof DOMException && err.name === "AbortError") return;
        setPreview(null);
      } finally {
        setPreviewLoading(false);
      }
    },
    [apiUrl, previewAbort],
  );

  useEffect(() => {
    fetch("/api/files/volumes")
      .then((res) => res.json())
      .then((data: { volumes: string[] }) => {
        const list = data.volumes ?? [];
        setVolumes(list);
        if (!volumeKey || !list.includes(volumeKey)) {
          const first = list[0];
          if (first) {
            setVolumeKey(first);
            localStorage.setItem("appkit:files:volumeKey", first);
          }
        }
      })
      .catch(() => {});
  }, [volumeKey]);

  useEffect(() => {
    if (volumeKey) {
      loadDirectory();
    }
  }, [volumeKey, loadDirectory]);

  const resolveEntryPath = (entry: DirectoryEntry) => {
    const name = entry.name ?? "";
    return currentPath ? `${currentPath}/${name}` : name;
  };

  const handleEntryClick = (entry: DirectoryEntry) => {
    const entryPath = resolveEntryPath(entry);
    if (entry.is_directory) {
      loadDirectory(entryPath);
    } else {
      setSelectedFile(entryPath);
      loadPreview(entryPath);
    }
  };

  const navigateToParent = () => {
    if (isAtRoot) return;
    const segments = currentPath.split("/").filter(Boolean);
    segments.pop();
    const parentPath = segments.join("/");
    loadDirectory(parentPath || undefined);
  };

  const allSegments = normalize(currentPath).split("/").filter(Boolean);

  const navigateToBreadcrumb = (index: number) => {
    const targetPath = allSegments.slice(0, index + 1).join("/");
    loadDirectory(targetPath);
  };

  const MAX_UPLOAD_SIZE = 5 * 1024 * 1024 * 1024; // 5 GB

  const handleUpload = async (e: React.ChangeEvent<HTMLInputElement>) => {
    const file = e.target.files?.[0];
    if (!file) return;

    if (file.size > MAX_UPLOAD_SIZE) {
      setError(
        `File "${file.name}" is too large (${(file.size / 1024 / 1024).toFixed(1)} MB). Maximum upload size is ${MAX_UPLOAD_SIZE / 1024 / 1024 / 1024} GB.`,
      );
      if (fileInputRef.current) fileInputRef.current.value = "";
      return;
    }

    setUploading(true);
    try {
      const uploadPath = currentPath
        ? `${currentPath}/${file.name}`
        : file.name;
      const response = await fetch(apiUrl("upload", { path: uploadPath }), {
        method: "POST",
        body: file,
      });

      if (!response.ok) {
        const data = await response.json().catch(() => ({}));
        throw new Error(data.error ?? `Upload failed (${response.status})`);
      }

      await loadDirectory(currentPath || undefined);
    } catch (err) {
      setError(err instanceof Error ? err.message : String(err));
    } finally {
      setUploading(false);
      if (fileInputRef.current) {
        fileInputRef.current.value = "";
      }
    }
  };

  const handleDelete = async () => {
    if (!selectedFile) return;

    const fileName = selectedFile.split("/").pop();
    if (!window.confirm(`Delete "${fileName}"?`)) return;

    setDeleting(true);
    try {
      const response = await fetch(
        `/api/files/${volumeKey}?path=${encodeURIComponent(selectedFile)}`,
        { method: "DELETE" },
      );

      if (!response.ok) {
        const data = await response.json().catch(() => ({}));
        throw new Error(data.error ?? `Delete failed (${response.status})`);
      }

      setSelectedFile(null);
      setPreview(null);
      await loadDirectory(currentPath || undefined);
    } catch (err) {
      setError(err instanceof Error ? err.message : String(err));
    } finally {
      setDeleting(false);
    }
  };

  const handleCreateDirectory = async () => {
    const name = newDirName.trim();
    if (!name) return;

    setCreatingDir(true);
    try {
      const dirPath = currentPath ? `${currentPath}/${name}` : name;
      const response = await fetch(apiUrl("mkdir"), {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ path: dirPath }),
      });

      if (!response.ok) {
        const data = await response.json().catch(() => ({}));
        throw new Error(
          data.error ?? `Create directory failed (${response.status})`,
        );
      }

      setShowNewDirInput(false);
      setNewDirName("");
      await loadDirectory(currentPath || undefined);
    } catch (err) {
      setError(err instanceof Error ? err.message : String(err));
    } finally {
      setCreatingDir(false);
    }
  };

  return (
    <div className="space-y-6 w-full max-w-7xl mx-auto">
      <div>
        <h2 className="text-2xl font-bold text-foreground">Files</h2>
        <p className="text-sm text-muted-foreground mt-1">
          Browse and manage files in Databricks Volumes.
        </p>
      </div>

      <div className="flex items-center justify-between">
        <div className="flex items-center gap-3">
          {volumes.length > 1 && (
            <select
              value={volumeKey}
              onChange={(e) => {
                const v = e.target.value;
                setVolumeKey(v);
                localStorage.setItem("appkit:files:volumeKey", v);
                setCurrentPath("");
                setEntries([]);
                setSelectedFile(null);
                setPreview(null);
              }}
              className="rounded-md border border-input bg-background px-3 py-1.5 text-sm"
            >
              {volumes.map((v) => (
                <option key={v} value={v}>
                  {v}
                </option>
              ))}
            </select>
          )}
          <FileBreadcrumb
            rootLabel={volumeKey || "Root"}
            segments={allSegments}
            onNavigateToRoot={() => loadDirectory()}
            onNavigateToSegment={navigateToBreadcrumb}
          />
        </div>

        <div className="flex items-center gap-2">
          <Button
            variant="outline"
            size="sm"
            onClick={() => setShowNewDirInput(true)}
          >
            <FolderPlus className="h-4 w-4 mr-2" />
            New Folder
          </Button>
          <input
            ref={fileInputRef}
            type="file"
            className="hidden"
            onChange={handleUpload}
          />
          <Button
            variant="outline"
            size="sm"
            disabled={uploading}
            onClick={() => fileInputRef.current?.click()}
          >
            {uploading ? (
              <Loader2 className="h-4 w-4 mr-2 animate-spin" />
            ) : (
              <Upload className="h-4 w-4 mr-2" />
            )}
            {uploading ? "Uploading..." : "Upload"}
          </Button>
        </div>
      </div>

      <div className="flex gap-6">
        <DirectoryList
          className="flex-2 min-w-0"
          entries={entries}
          loading={loading}
          error={error}
          onEntryClick={handleEntryClick}
          onNavigateToParent={navigateToParent}
          onRetry={() => loadDirectory(currentPath || undefined)}
          isAtRoot={isAtRoot}
          selectedPath={selectedFile}
          resolveEntryPath={resolveEntryPath}
          hasCurrentPath={!!currentPath}
          headerContent={
            showNewDirInput ? (
              <NewFolderInput
                value={newDirName}
                onChange={setNewDirName}
                onCreate={handleCreateDirectory}
                onCancel={() => {
                  setShowNewDirInput(false);
                  setNewDirName("");
                }}
                creating={creatingDir}
              />
            ) : undefined
          }
        />

        <FilePreviewPanel
          className="flex-1 min-w-0"
          selectedFile={selectedFile}
          preview={preview}
          previewLoading={previewLoading}
          onDownload={(path) =>
            window.open(apiUrl("download", { path }), "_blank")
          }
          onDelete={handleDelete}
          deleting={deleting}
          imagePreviewSrc={(p) => apiUrl("raw", { path: p })}
        />
      </div>
    </div>
  );
}
```

#### Update `client/src/App.tsx`

Add the import, nav link, and route:

```tsx
// Add import at top
import { FilesPage } from './pages/files/FilesPage';

// Add nav link inside the <nav> element
<NavLink to="/files" className={navLinkClass}>
  Files
</NavLink>

// Add route in the router children array
{ path: '/files', element: <FilesPage /> },
```

### 5. Deploy and test

Validate before deploying to catch type errors and run smoke tests:

```bash
databricks apps validate --profile <PROFILE>
databricks apps deploy --profile <PROFILE>
```

Open the app URL while signed in to Databricks, navigate to the Files page, and verify:

1. Upload a `.csv` file. It appears in the directory list.
2. Click the file. The preview panel shows metadata and the CSV row table renders below it.
3. Upload a file with a disallowed extension. The error message appears without uploading.
4. Download and delete a file. The list refreshes correctly.

Check status and logs if the app does not start:

```bash
databricks apps get <app-name> --profile <PROFILE>
databricks apps logs <app-name> --profile <PROFILE>
```

#### References

- [Files plugin docs](https://databricks.com/devhub/docs/appkit/v0/plugins/files)
- [Unity Catalog Volumes](https://docs.databricks.com/en/connect/unity-catalog/volumes.html)
- [DirectoryList component](https://databricks.com/devhub/docs/appkit/v0/api/appkit-ui/files/DirectoryList)
- [FilePreviewPanel component](https://databricks.com/devhub/docs/appkit/v0/api/appkit-ui/files/FilePreviewPanel)
