# Databricks Developer Hub

> Documentation, templates, and examples for building apps and AI agents on Databricks using Lakebase (managed Postgres), Model Serving, and Databricks Apps.

## Start Here

Site orientation: what DevHub is, how to use templates and examples, and where to find companion docs.

- [Start here](https://databricks.com/devhub/docs/start-here.md): This site is for building internal apps on Databricks. Pick a template, scaffold with [AppKit](https://databricks.com/devhub/docs/appkit/v0), deploy, and iterate.

## Agent Bricks

Connect Agent Bricks agents, governed LLM endpoints, and Genie spaces to your AppKit app. Covers AI Gateway, the Model Serving plugin for calling LLM and agent endpoints, and the Genie plugin for natural-language data queries.

- [What is Agent Bricks?](https://databricks.com/devhub/docs/agents/overview.md): **Agent Bricks** is Databricks' enterprise agent platform for building, deploying, and governing agents that operate on your business data. It unifies model access, execution, governance, and context in a single system: from the model you call, to the data your agent reads, to the identity it acts under. In your workspace you configure Knowledge Assistants, Multi-Agent Supervisors, and custom Python agents. Databricks handles evaluation, tuning, and quality improvement, then hosts each agent at an HTTP endpoint your app can call.
- [AI Gateway](https://databricks.com/devhub/docs/agents/ai-gateway.md): **AI Gateway** is a Databricks governance layer for LLM endpoints and MCP servers. It tracks usage, enforces rate limits, logs payloads, filters unsafe content and PII, and attributes cost. See the [AI Gateway overview](https://docs.databricks.com/aws/en/ai-gateway/) for a full product introduction. From your AppKit app, you call a governed endpoint with the Model Serving plugin. This page covers the AppKit wiring, the governance features, and the CLI for inspecting and provisioning endpoints.
- [Genie spaces](https://databricks.com/devhub/docs/agents/genie.md): Give your users a chat box that queries your data. No text-to-SQL, no schema mapping, no custom LLM. A **Genie space** is a Databricks natural-language interface over Unity Catalog tables: curated datasets plus a knowledge store (synonyms, example SQL, column descriptions) plus a compound AI system that turns questions into SQL. Your AppKit app wires it in with one plugin on the server and one component on the page.
- [Custom agent endpoints](https://databricks.com/devhub/docs/agents/custom-agents.md): When your AppKit app needs more than a foundation model response or a Genie-style data query, you call a **custom agent**: an LLM shaped by instructions, tools, document grounding, or multi-agent orchestration. On Databricks, custom agents deploy as Model Serving endpoints, so the Model Serving plugin calls them like any foundation model.

## Apps

Host and operate web applications as managed Databricks workspace resources.

- [What is Databricks Apps?](https://databricks.com/devhub/docs/apps/overview.md): Databricks Apps hosts your web app inside your workspace. It gets a fixed URL, built-in OAuth, and direct access to your workspace data and services. No separate hosting service, no auth layer to build, no credential rotation to manage.
- [Quickstart](https://databricks.com/devhub/docs/apps/quickstart.md): Prerequisites: Databricks CLI `v0.296+` with an [authenticated profile](https://databricks.com/devhub/docs/tools/databricks-cli#authenticate), Node.js 22+ (AppKit apps are Node/TypeScript), and a Databricks workspace with Apps enabled.
- [App configuration](https://databricks.com/devhub/docs/apps/configuration.md): Two files control how your AppKit app starts and what it connects to: `app.yaml` (runtime behavior and environment variables) and `databricks.yml` (Databricks resources). Each app gets a fixed URL assigned at creation; it cannot change.
- [App development](https://databricks.com/devhub/docs/apps/development.md): The CLI and workflow reference for Databricks Apps and AppKit. It covers adding plugins, scaffolding, deploying, managing, and troubleshooting your app.

## Lakebase

Managed PostgreSQL for operational workloads with Databricks-native governance and Delta Lake sync.

- [Quickstart](https://databricks.com/devhub/docs/lakebase/quickstart.md): Prerequisites: Databricks CLI `v0.296+` with an [authenticated profile](https://databricks.com/devhub/docs/tools/databricks-cli#authenticate); `psql` (PostgreSQL client) if using `databricks psql`, or [`generate-database-credential`](https://databricks.com/devhub/docs/lakebase/development#local-database-access) with any PostgreSQL client; and a workspace with Lakebase Postgres access enabled.
- [Lakebase Postgres configuration](https://databricks.com/devhub/docs/lakebase/configuration.md): AppKit connects to Lakebase Postgres using a `postgres` resource declared in `databricks.yml` and `LAKEBASE_ENDPOINT` set in `app.yaml`.
- [Lakebase Postgres development](https://databricks.com/devhub/docs/lakebase/development.md): The `lakebase()` plugin provides a standard `pg.Pool` with automatic OAuth token refresh. Once registered, access it via `AppKit.lakebase`, as in the sketch below.
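A minimal usage sketch. Only the fact that `AppKit.lakebase` exposes a standard `pg.Pool` comes from the page above; the import path, helper name, and `todos` table are illustrative assumptions, not the documented API.

```typescript
// Hypothetical helper: assumes the lakebase() plugin has already been registered at startup.
// AppKit.lakebase is documented as a standard pg.Pool, so the usual node-postgres query API
// applies; OAuth token refresh is handled by the plugin.
import { AppKit } from "@databricks/appkit"; // import path is an assumption

export async function listOpenTodos() {
  const { rows } = await AppKit.lakebase.query(
    "SELECT id, title, done FROM todos WHERE done = $1 ORDER BY id",
    [false] // parameterized query, per standard pg usage
  );
  return rows;
}
```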
## AppKit

TypeScript SDK for building full-stack Databricks Apps with plugin-based architecture, type-safe data access, and pre-built UI components.

- [v0](https://databricks.com/devhub/docs/appkit/v0.md): Learn how to get started with AppKit.
- [plugins](https://databricks.com/devhub/docs/appkit/v0/plugins.md): Plugins are modular extensions that add capabilities to your AppKit application. They follow a defined lifecycle and have access to shared services like caching, telemetry, and streaming.

## Tools

CLI, SDKs, agent skills, and MCP integrations for Databricks developer workflows.

- [Databricks CLI](https://databricks.com/devhub/docs/tools/databricks-cli.md): The Databricks CLI is required for every template on this site; each template includes CLI commands for scaffolding and deployment. Install and authenticate before you begin.
- [Agent skills](https://databricks.com/devhub/docs/tools/ai-tools/agent-skills.md): Agent skills are task-specific instruction files that AI coding assistants load to perform Databricks development tasks. The Databricks skills live in [databricks/databricks-agent-skills](https://github.com/databricks/databricks-agent-skills) and follow the open [agent skills standard](https://agentskills.io/).
- [Docs MCP Server](https://databricks.com/devhub/docs/tools/ai-tools/docs-mcp-server.md): The DevHub Docs MCP Server gives coding agents and IDE assistants read access to all Databricks developer documentation on dev.databricks.com. Agents can discover available pages and fetch individual docs as markdown without leaving the editor.

## Templates

Opinionated, copy-pasteable templates for building on Databricks. Browse the catalog at https://databricks.com/devhub/templates.

- [All Templates](https://databricks.com/devhub/templates.md): Browse all templates.
- [AI Chat App](https://databricks.com/devhub/templates/ai-chat-app.md): Model Serving integration, AI SDK streaming chat, and Lakebase-persisted chat history.
- [App with Lakebase](https://databricks.com/devhub/templates/app-with-lakebase.md): Wire up a Databricks App with Lakebase for persistent data storage. Includes schema setup and full CRUD API routes.
- [Genie Analytics App](https://databricks.com/devhub/templates/genie-analytics-app.md): Build a minimal Databricks App with AI/BI Genie conversational analytics. Covers Genie space configuration, plugin wiring, and deploy.
- [Lakebase Off-Platform](https://databricks.com/devhub/templates/lakebase-off-platform.md): Use Lakebase from apps hosted outside Databricks App Platform (for example on AWS, Vercel, or Netlify) with portable env, token, and Drizzle patterns.
- [Operational Data Analytics](https://databricks.com/devhub/templates/operational-data-analytics.md): End-to-end setup for analyzing operational database data in the lakehouse: Unity Catalog with external storage, Lakebase provisioning, Lakehouse Sync CDC replication, and a medallion architecture pipeline with silver and gold layers.
- [Set Up Your Local Dev Environment](https://databricks.com/devhub/templates/set-up-your-local-dev-environment.md): Install the Databricks CLI, authenticate a profile, and verify the handshake. The strict prerequisite for every other DevHub recipe and template.
- [Spin Up a Databricks App](https://databricks.com/devhub/templates/spin-up-databricks-app.md): Scaffold a fresh AppKit Databricks App with `databricks apps init`, run it locally, and deploy to your workspace.
- [Onboard Your Coding Agent](https://databricks.com/devhub/templates/onboard-your-coding-agent.md): Install Databricks agent skills (project-scoped), wire up the DevHub Docs MCP server, and bootstrap an AGENTS.md so your coding assistant knows this repo's workspace defaults.
- [Create a Lakebase Instance](https://databricks.com/devhub/templates/lakebase-create-instance.md): Provision a managed Lakebase Postgres project on Databricks and collect the connection values needed by downstream templates.
- [Lakebase Data Persistence](https://databricks.com/devhub/templates/lakebase-data-persistence.md): Add a managed Postgres database to your Databricks app using the Lakebase plugin. Covers schema setup, table creation, and full CRUD REST API routes.
- [Lakebase pgvector](https://databricks.com/devhub/templates/lakebase-pgvector.md): Enable vector similarity search in Lakebase using the pgvector extension. Covers extension setup, vector table design, insert and cosine retrieval helpers, and IVFFlat/HNSW index options (a minimal query sketch follows this list).
- [Query AI Gateway Endpoints](https://databricks.com/devhub/templates/foundation-models-api.md): Query AI Gateway endpoints for production-ready access to foundation models with built-in governance.
- [Generate Embeddings with AI Gateway](https://databricks.com/devhub/templates/embeddings-generation.md): Generate text embeddings from a Databricks AI Gateway endpoint using the Databricks SDK.
- [Create a Databricks Model Serving endpoint](https://databricks.com/devhub/templates/model-serving-endpoint-creation.md): Create and validate a Databricks Model Serving endpoint for AI chat inference in Databricks Apps.
- [Streaming AI Chat with Model Serving](https://databricks.com/devhub/templates/ai-chat-model-serving.md): Build a streaming AI chat experience using the AI SDK and Databricks Model Serving endpoints.
- [Lakebase Agent Memory](https://databricks.com/devhub/templates/lakebase-agent-memory.md): Persist your AI agent's chat sessions and messages in Lakebase so users can resume conversations and your agent can reason over prior turns across deploys.
- [Lakebase Change Data Feed: Sync Lakebase to Unity Catalog (Autoscaling)](https://databricks.com/devhub/templates/lakebase-change-data-feed-autoscaling.md): Replicate Lakebase Autoscaling Postgres tables into Unity Catalog as managed Delta tables using Lakehouse Sync, with CDC and SCD Type 2 history.
- [Sync Tables: Unity Catalog to Lakebase (Autoscaling)](https://databricks.com/devhub/templates/sync-tables-autoscaling.md): Sync Unity Catalog tables into Lakebase Autoscaling Postgres as synced tables for sub-10ms application queries, with snapshot, triggered, or continuous modes.
- [Set Up Unity Catalog with External Storage](https://databricks.com/devhub/templates/unity-catalog-setup.md): Create a Unity Catalog catalog backed by an external S3 bucket with storage credentials, an external location, and a schema ready for lakehouse tables.
- [Genie Conversational Analytics](https://databricks.com/devhub/templates/genie-conversational-analytics.md): Embed a Databricks AI/BI Genie chat interface so users can explore data through natural language. Configure a Genie space, wire up server and client plugins, declare app resources, and deploy.
- [Genie Multi-Space Selector](https://databricks.com/devhub/templates/genie-multi-space.md): Add a space selector so users can switch between multiple AI/BI Genie spaces from a single page. Covers multi-alias server config, per-space bundle resources, and automatic conversation cleanup on space switch and redeployment.
- [Medallion Architecture from CDC History Tables](https://databricks.com/devhub/templates/medallion-architecture-from-cdc.md): Transform Lakehouse Sync CDC history tables into a medallion architecture with silver (current state) and gold (aggregations) layers using Lakeflow Declarative Pipelines.
- [Lakebase Env Management for Off-Platform Apps](https://databricks.com/devhub/templates/lakebase-off-platform-env-management.md): Define and validate cross-platform environment variables for Lakebase-backed apps deployed outside Databricks App Platform.
- [Lakebase Token Management](https://databricks.com/devhub/templates/lakebase-token-management.md): Implement cached workspace and Lakebase credential token flows for secure Postgres access in off-platform deployments.
- [Drizzle + Lakebase in an Off-Platform App](https://databricks.com/devhub/templates/lakebase-drizzle-off-platform.md): Connect Drizzle ORM to Lakebase with pg password callbacks and migration-time temporary DATABASE_URL credentials.
- [Volume File Manager](https://databricks.com/devhub/templates/volume-file-upload.md): Add file upload, browsing, download, delete, file type validation, and CSV row preview to your Databricks app using Unity Catalog Volumes.
- [Agentic Support Console](https://databricks.com/devhub/templates/agentic-support-console.md): End-to-end AI-powered support console combining Lakebase, Lakehouse Sync, a medallion pipeline, an LLM agent job, reverse sync, and a Databricks App with Genie analytics.
- [Vacation Rentals Operations Console](https://databricks.com/devhub/templates/vacation-rentals.md): Vacation rental ops dashboard with revenue analytics from a SQL Warehouse, a booking queue with Lakebase-backed flags and agent notes, and an embedded Genie chat panel.
- [SaaS Subscription Tracker](https://databricks.com/devhub/templates/saas-tracker.md): Internal tool for tracking team SaaS subscriptions, owners, costs, and renewals with Lakebase persistence and Genie spend analytics.
- [Content Moderator](https://databricks.com/devhub/templates/content-moderator.md): Internal content moderation tool with per-channel guidelines, AI-powered compliance scoring via Model Serving, and a moderator review workflow backed by Lakebase and Genie analytics.
- [Inventory Intelligence](https://databricks.com/devhub/templates/inventory-intelligence.md): Retail inventory management with AI-powered demand forecasting, replenishment recommendations, and optional Genie analytics. Built on a live medallion pipeline synced to Lakebase.
- [RAG Chat App](https://databricks.com/devhub/templates/rag-chat.md): Streaming Retrieval-Augmented Generation chat app with pgvector retrieval from Lakebase, a Wikipedia seed corpus, Model Serving generation, and Lakebase-backed chat history. Scaffolded via `databricks apps init`.
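For the Lakebase pgvector template above, here is a minimal sketch of the core operations. The table name, 1536-dimension embeddings, and use of a plain `pg.Pool` are illustrative assumptions; the template's actual schema and helpers may differ.

```typescript
import { Pool } from "pg"; // any Postgres pool works, e.g. the one exposed by the Lakebase plugin

// Illustrative schema: a `docs` table with 1536-dimension embeddings (assumed, not from the template).
export async function searchDocs(pool: Pool, queryEmbedding: number[]) {
  await pool.query("CREATE EXTENSION IF NOT EXISTS vector");
  await pool.query(
    "CREATE TABLE IF NOT EXISTS docs (id serial PRIMARY KEY, body text, embedding vector(1536))"
  );
  // HNSW index for approximate nearest-neighbor cosine search (IVFFlat is the other option)
  await pool.query(
    "CREATE INDEX IF NOT EXISTS docs_embedding_hnsw ON docs USING hnsw (embedding vector_cosine_ops)"
  );
  // pgvector accepts '[v1,v2,...]' literals; <=> is cosine distance, so 1 - distance = similarity
  const literal = `[${queryEmbedding.join(",")}]`;
  const { rows } = await pool.query(
    `SELECT id, body, 1 - (embedding <=> $1::vector) AS cosine_similarity
       FROM docs
      ORDER BY embedding <=> $1::vector
      LIMIT 5`,
    [literal]
  );
  return rows;
}
```

Index choice is a tuning decision: HNSW generally gives better recall and latency at higher build cost, while IVFFlat builds faster but needs its `lists` parameter tuned to the data size.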
## Solutions

Databricks use-case solutions built on Lakebase, Agent Bricks, and Databricks Apps.

- [All Solutions](https://databricks.com/devhub/solutions.md): Overview of Databricks developer solutions.
- [Introducing dev.databricks.com](https://databricks.com/devhub/solutions/devhub-launch.md): A new developer hub for building on Databricks: opinionated, copy-pasteable templates and agent-ready documentation for software engineers.
- [How to Build Production-Ready Data and AI Apps with Databricks Apps and Lakebase](https://www.databricks.com/blog/how-build-production-ready-data-and-ai-apps-databricks-apps-and-lakebase): Build full-stack data apps on Databricks Apps with Lakebase synced tables that replicate Unity Catalog data in seconds, and ship everything as code with Databricks Asset Bundles. (Databricks Blog)
- [Ship quality enterprise AI agents to business users with Agent Bricks and Databricks Apps](https://www.databricks.com/blog/ship-quality-enterprise-ai-agents-business-users-agent-bricks-and-databricks-apps): Build domain-specific AI agents with Agent Bricks, deploy them through a chat UI on Databricks Apps, and distribute them to business users via Databricks One. (Databricks Blog)
- [How to use Lakebase as a transactional data layer for Databricks Apps](https://www.databricks.com/blog/how-use-lakebase-transactional-data-layer-databricks-apps): Walk through a holiday request app that uses Lakebase as the operational Postgres tier behind Databricks Apps, from database setup to a fully connected frontend. (Databricks Blog)
- [Database Branching in Postgres: Git-Style Workflows with Databricks Lakebase](https://www.databricks.com/blog/database-branching-postgres-git-style-workflows-databricks-lakebase): Use Lakebase copy-on-write branches to give every developer, pull request, and CI run an isolated Postgres environment, and power instant point-in-time recovery and ephemeral databases for AI agents. (Databricks Blog)