The Future of AI: Build Agents That Work
Type: On-Demand Video
Duration: 22 minutes 53 seconds
Related Links
Sam Altman and Ali Ghodsi: OpenAI + Databricks, AI Agents in the Enterprise, The future of GPT-OSS
Watch Sam Altman and Ali Ghodsi, co-founders of OpenAI and Databricks respectively, dive deep into what's next for AI agents in business. From OpenAI's founding as a research lab in 2015 to Databricks' launch by the creators of Apache Spark in 2013, these leaders share unique insights on enterprise AI.
Hosted by Hanlin Tang, Databricks' CTO of Neural Networks.
Key Takeaways
Ali and Sam discuss the strategic partnership bringing OpenAI's most powerful models natively into Databricks, enabling enterprises to use cutting-edge AI on their proprietary data with full governance, security, and privacy controls.
Why This Partnership Matters
Ali Ghodsi:
"Every one of our enterprise customers wants to use OpenAI models on their enterprise data. Getting these two things working together is nontrivial because the data is sensitive—they need privacy, auditing, GDPR rights—but they also want to use the models to build agents and get insights. Customer demand has been overwhelming."
Sam Altman:
"Enterprise is becoming one of our biggest focuses. We've had 6x enterprise growth this year. We're heading into the phase of AI where models are getting so good that enterprises will need to use them, want to use them, and bring them into their whole ecosystem. We cannot imagine a better partner than Databricks to make that happen."
Key Discussion Topics
The Enterprise Context Advantage
- Consumer AI started with public data accumulated over thousands of years
- Enterprise AI unlocks proprietary data not available to LLMs
- Providing enterprise context to agents is the big unlock for the next wave of AI
Model Capability Evolution: The 50% Task Horizon Metric
Sam Altman introduces a powerful framework for thinking about AI progress:
"For a particular class of task, how long of a task does the model have a 50% chance of success at?"
The Evolution:
- GPT-3.5 Launch: 5-second coding tasks
- GPT-4 Iterations: 5-minute coding tasks
- GPT-5: 5-hour coding tasks
"But a lot of what an enterprise does requires tasks that take months or years. Lengthening the horizon that these models can work on, giving them all the context that exists inside an enterprise, and expanding this to many more verticals—that will be the important thrust." — Sam Altman
Context Management: The Key to Longer Horizons
Ali Ghodsi emphasizes that more enterprise context directly extends task horizons:
"If you bring more context with the data enterprises have—which is proprietary—you can increase that horizon length. One of the things we developed called JAPA automatically optimizes that enterprise context so you can feed the relevant context from different docs inside the enterprise without humans sitting there manually optimizing it."
The Future of Open Source Models
Sam on Open-Weight Models:
- "There's clearly demand for models that users control and can run on their own systems"
- "Much less demand than for the most capable cloud-hosted models, but people who want it really want it"
- OpenAI is working toward someday running GPT-5-quality models in 120B parameter open-source form factors
- Vision: New computers for the AI era that can run great models locally when Wi-Fi is down or when privacy demands it
"Privacy and freedom will be two extremely important principles for how people use AI. If it becomes as important as we expect, people will want good local models." — Sam Altman
Governance as the Limiting Factor
Sam Altman:
"This is going to become the fundamental limiter for adoption of AI in the enterprise. It won't be about intelligence, it won't be about price—the research teams will figure that out. But enterprises are starting to realize just how critical governance is, and it will be the limiting reagent."
Ali Ghodsi:
"We've worked very closely to build with privacy and security from the ground up—audit logging on everything, access control, on-brand outputs, competitor filtering. If you use GPT-5 now inside Databricks, you get all of that out of the box."
The Future: AI Coworkers Across Every Function
Ali's Vision:
- Coding is only 20% of what an engineer does at Databricks
- The other 80%: design docs, PRDs, discussions, meetings—all have potential for agent involvement
- End-to-end agent involvement across the entire workflow will make the coding itself much better by providing missing context
- Every function—sales engineering, marketing ops, financial analysis—will be completely transformed
Most Exciting Use Cases Today:
- AstraZeneca: Sifting through 400,000 documents, doing what no human could do
- Financial Services: Analyzing SEC filings and related documents to glean alpha for investments
- Insurance: Underwriting automation
- Healthcare: Processing thousands of pages per hospital visit for risk extraction

