Introducing OpenAI’s New Open Models on Databricks

Securely build intelligent, domain-specific AI agents with OpenAI gpt-oss

Published: August 5, 2025

Announcements · 3 min read

Summary

  • gpt-oss on Databricks: Open-weight 20B and 120B models with advanced reasoning and fast, cost-efficient performance.
  • Secure, domain-ready AI: Build custom, compliant agents that run next to your enterprise data.
  • Scalable and flexible: Automate tasks, fine-tune models, and power real-time apps with 131k context.

We’re excited to partner with OpenAI to launch their open-weight models, gpt-oss 20B and gpt-oss 120B, which are natively available on Databricks today.

The gpt-oss models set a new standard of quality for open models, with support for advanced reasoning and tool use. Open models are more easily customizable to build AI that can reason over your enterprise data and domains, providing a powerful option alongside proprietary models.

gpt-oss joins the growing set of frontier models on Databricks, enabling you to build domain-specific custom AI agents using the best model for your use case while securely leveraging your data with full governance and observability.

What You Can Build with gpt-oss

OpenAI is releasing two models: gpt-oss 20B and 120B, both of which support reasoning and tool use.

These models expand what you can build with GenAI, enabling new use cases and letting you balance speed, cost, and quality. gpt-oss can be used alone or alongside models like GPT-4o, Claude, or Llama. It’s a great fit for:

  • Fast agents
Built with a Mixture of Experts architecture, gpt-oss delivers low-latency performance for use cases like search, chat, and real-time decisioning.
  • Analyzing massive volumes of data
    Summarize and classify millions of documents efficiently by running models directly next to your data and tooling on Databricks.
  • Model customization
    Fine-tune open weights to fit your domain and use case, delivering higher quality for specialized tasks where general-purpose models fall short.
  • Enterprise-grade compliance
Deploy within your already-approved, HIPAA- and PCI-compliant, Unity Catalog–governed platform.

Model Highlights

  • 🧠 Two sizes: 20B and 120B
  • ⚙️ Chain-of-thought reasoning, instruction following, and tool use
  • ⚡ Lightning-fast latency and best-in-class cost efficiency, thanks to its Mixture of Experts architecture
  • 🧾 131k context for long documents and RAG
  • 📜 Apache 2.0 license

Use gpt-oss on Databricks

Start using gpt-oss right away in Playground with our Foundation Model API, just like other supported models. As always, Mosaic AI Gateway automatically governs usage with built-in logging, guardrails, and PII detection, so teams can confidently use gpt-oss in enterprise applications.

Try gpt-oss in AI Playground
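Because Foundation Model API endpoints accept OpenAI-style chat-completions requests, calling gpt-oss from code looks much like calling any other served model. The sketch below builds such a request payload; the endpoint name `databricks-gpt-oss-120b` and the environment-variable names are assumptions for illustration — check the Serving page in your workspace for the actual endpoint name.

```python
import json

# Hypothetical endpoint name -- confirm the exact name under
# Serving in your Databricks workspace before using it.
ENDPOINT = "databricks-gpt-oss-120b"

def build_chat_request(prompt: str, max_tokens: int = 256) -> dict:
    """Build an OpenAI-style chat-completions payload for a
    Foundation Model API serving endpoint."""
    return {
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
        "max_tokens": max_tokens,
    }

payload = build_chat_request("Summarize our Q3 support tickets.")
print(json.dumps(payload, indent=2))

# To actually send it, POST the payload to the endpoint's
# invocations URL with a workspace token, e.g.:
#
#   import os, requests
#   resp = requests.post(
#       f"{os.environ['DATABRICKS_HOST']}/serving-endpoints/{ENDPOINT}/invocations",
#       headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
#       json=payload,
#   )
```

The same payload works unchanged if you later switch the endpoint to the 20B variant to trade quality for latency and cost.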

Securely Connect gpt-oss to Your Data

Use gpt-oss to build intelligent copilots and agents that can securely access your enterprise data. Retrieve documents, invoke APIs, and power RAG workflows with built-in governance.
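Since gpt-oss supports tool use, an agent can be given function definitions it may call to retrieve documents or invoke APIs. Here is a minimal sketch of an OpenAI-style tool specification; `search_documents` and its parameters are hypothetical names for illustration, not an actual Databricks API.

```python
# Illustrative tool definition a gpt-oss agent could call to retrieve
# governed enterprise documents. The function name and parameters are
# assumptions -- wire them to your own retriever.
lookup_tool = {
    "type": "function",
    "function": {
        "name": "search_documents",  # hypothetical retriever
        "description": "Search governed enterprise documents for a query.",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {"type": "string", "description": "Search text."},
                "top_k": {"type": "integer", "description": "Results to return."},
            },
            "required": ["query"],
        },
    },
}

def build_agent_request(question: str) -> dict:
    """Chat request that exposes the retriever tool to the model,
    so it can decide when to fetch documents before answering."""
    return {
        "messages": [{"role": "user", "content": question}],
        "tools": [lookup_tool],
    }

req = build_agent_request("Where is our refund policy documented?")
```

When the model responds with a tool call, your agent executes the retrieval, appends the result as a tool message, and asks the model to continue — the standard tool-use loop.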

Automate Tasks with SQL or Lakeflow

Use gpt-oss in SQL, notebooks, or Lakeflow to automate classification, summarization, and extraction—at scale, with zero setup.
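In SQL, batch inference over a table can be expressed with the `ai_query` function, which sends each row's text to a serving endpoint. A small sketch of generating such a statement — the table, column, and endpoint names here are placeholders for whatever exists in your workspace:

```python
def classification_sql(table: str, text_col: str, endpoint: str) -> str:
    """Build a Databricks SQL statement that labels each row by
    passing its text to ai_query(). Table, column, and endpoint
    names are caller-supplied placeholders."""
    prompt = (
        "Classify the following support ticket as billing, technical, "
        "or other. Reply with one word: "
    )
    return (
        f"SELECT {text_col}, "
        f"ai_query('{endpoint}', CONCAT('{prompt}', {text_col})) AS label "
        f"FROM {table}"
    )

# Hypothetical table/endpoint names for illustration.
sql = classification_sql("support_tickets", "body", "databricks-gpt-oss-20b")
print(sql)
```

Run the resulting statement in a notebook or as a Lakeflow task to classify every row at scale, with governance and logging applied automatically.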

Customize gpt-oss

Open models such as gpt-oss are more easily customizable on your domain-specific data. Fine-tune gpt-oss with your unique enterprise data using Databricks Serverless GPU Compute to optimize the model’s quality for your end use case. Leverage interactive notebooks with built-in MLflow tracking for fast, scalable fine-tuning experiments and workflows. Contact your Databricks team for early access to this fine-tuning capability. Additionally, enterprises can augment gpt-oss with structured and unstructured data via Vector Search and feature serving.

Get Started with gpt-oss

The gpt-oss models are available today across AWS, Azure, and GCP in Foundation Model API, with support for Provisioned Throughput and AI Functions rolling out over the next several days.
