Enterprise AI at Scale: Serving Google Gemini on Databricks for Production Workloads

Overview

Experience: In Person
Track: Artificial Intelligence & Agents
Industry: Healthcare & Life Sciences, Manufacturing, Financial Services
Technologies: Databricks SQL, Unity Catalog, Lakebase
Skill Level: Advanced

Foundation models are moving beyond chatbots and copilots into the core of enterprise data pipelines — enriching records, classifying documents, resolving entities, and powering agentic workflows at massive scale. This session shows how Google Gemini and Databricks are partnering to make this real for production enterprises today.

Google Cloud shows how Gemini is available as a first-party model on Databricks, served natively via ai_query() and Model Serving — with built-in governance via AI Gateway, including cost controls, rate limits, usage tracking, and Unity Catalog logging.
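As a hedged sketch of what this looks like in practice, a Gemini serving endpoint can be invoked directly from Databricks SQL with ai_query(). The endpoint name, table, and prompt below are hypothetical placeholders, not the session's actual configuration:

```sql
-- Sketch: classify records with a Gemini serving endpoint via ai_query().
-- 'gemini-endpoint' is a hypothetical endpoint name; substitute the
-- serving endpoint registered in your own workspace.
SELECT
  ticket_id,
  ai_query(
    'gemini-endpoint',
    CONCAT(
      'Classify this support ticket into one of: billing, outage, feature. ',
      'Ticket: ', body
    )
  ) AS category
FROM support_tickets;
```

Because the call goes through Model Serving, AI Gateway policies such as rate limits and usage logging apply to each ai_query() invocation without any change to the SQL.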

HG Insights then brings this to life with a production use case: a multi-stage AI pipeline processing 7 billion records where rules and custom BERT models handle the predictable cases, and Gemini acts as an agentic enrichment layer — autonomously reasoning over incomplete company profiles and resolving missing attributes across hundreds of millions of inference calls. Orchestrated by Lakeflow and monitored via AI/BI dashboards, this pipeline directly generates the revenue-critical datasets HG Insights delivers to Fortune 500 customers.
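The tiered pattern described above — cheap deterministic rules first, a custom classifier next, and Gemini only for the residual cases — can be sketched in Databricks SQL. All table, column, and endpoint names here are hypothetical, assuming rule matches and BERT predictions have been materialized as columns upstream:

```sql
-- Sketch of tiered enrichment (all names hypothetical):
-- tier 1 = rule-based match, tier 2 = custom BERT prediction,
-- tier 3 = Gemini enrichment only where the cheaper tiers returned NULL.
SELECT
  company_id,
  COALESCE(
    rule_industry,        -- tier 1: deterministic rules, if any matched
    bert_industry,        -- tier 2: BERT output, if above threshold
    ai_query(             -- tier 3: agentic enrichment via Gemini
      'gemini-endpoint',
      CONCAT('Infer the industry for this company profile: ', profile_json)
    )
  ) AS industry
FROM staged_company_profiles;
```

Routing only unresolved records to the model endpoint is what keeps hundreds of millions of inference calls affordable at billions-of-records scale.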

Whether you're enriching CRM data, building document classification pipelines, or deploying agentic data quality workflows, this session gives you the architecture patterns and lessons learned to run Gemini on Databricks at enterprise scale.

Session Speakers


Abhishek Bhagwat

ML Engineer, Applied AI
Google Cloud


Shaielsh Dargude

AI Fellow
HG Insights