Virtual Event
Experience the best of Data + AI World Tour Virtual EMEA

Dive into the latest innovations in data and AI
Couldn’t attend Databricks Data + AI World Tour in Amsterdam, London, Paris, Stockholm, Munich or Milan? No worries — you can now explore all the action anytime, from anywhere.
Tune in to discover:
- The latest innovations within the Databricks Data Intelligence Platform
- Inspiring product demonstrations, feature unveilings and AI advancements
- Real-world success stories shared by customers and industry leaders
- How top organizations are driving transformation through data and AI
Rewatch the sessions and relive the entire experience on your schedule.
Speakers

Samuel Bonamigo
Senior Vice President and General Manager, EMEA
Databricks

David Meyer
Senior Vice President, Product
Databricks

Dennis Michon
Head of Data
easyJet

Maria Zervou
Chief AI Officer, EMEA
Databricks

Naomi Hahn
VP of Data
Skyscanner
Agenda

Samuel Bonamigo
Senior Vice President and General Manager, EMEA
Databricks

David Meyer
Senior Vice President, Product
Databricks
Join an exclusive Q&A where Dennis Michon, Head of Data at easyJet, unpacks how easyJet transformed their revenue management system from outdated legacy tech into a cutting-edge, AI-driven platform in record time.
Discover the challenges of modern airline retail, concrete results from real-time pricing updates and how AI is reshaping commercial decision-making and customer experiences.
This interactive session is your gateway to learning about next-gen data intelligence, the human-AI collaboration journey and the bold moves powering easyJet’s future.

Dennis Michon
Head of Data
easyJet

Maria Zervou
Chief AI Officer, EMEA
Databricks
Discover how Skyscanner, serving 110 million users monthly and processing 35 million daily searches, is leveraging data intelligence to create exceptional travel experiences.
This customer presentation reveals the powerful analytics that optimize both business operations and user journeys, the machine learning models that rank and recommend the perfect travel options and the cutting-edge AI search technology that answers traveler questions to inspire new adventures.
Learn about the company’s strategic investments in their data foundation to strengthen governance, accelerate AI capabilities and successfully land GenAI at scale, transforming travel discovery from simple comparison into intelligent, personalized inspiration.

Naomi Hahn
VP of Data
Skyscanner
Self-guided breakout sessions
Gain access to extra breakout sessions that you can complete at your own pace.

Whether you’re just getting started with data governance or working to scale it across the enterprise, this session will walk you through how Unity Catalog helps simplify secure access, discovery and collaboration for data and AI.

We’ll start with the fundamentals of Unity Catalog and explain how it works and why it plays a central role in governing data and AI across clouds, teams and workloads. Then we’ll dive into the latest product updates, including live demos that show new capabilities in access control, lineage, discovery and monitoring. You’ll also get a first look at upcoming features and hear insights from customers already running Unity Catalog in production.

The session wraps up with clear guidance on how to design and roll out Unity Catalog effectively. We’ll cover how to structure catalogs, manage permissions and build for multicloud and multiregional environments. If you want a practical understanding of Unity Catalog and how to apply it in your organization, this session is for you.

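If you want to experiment before watching, the sketch below shows the kind of governance setup the session covers, using standard Unity Catalog SQL from a Databricks notebook. The catalog, schema and group names are placeholders.

```python
# Minimal Unity Catalog governance sketch (run in a Databricks notebook where
# `spark` is already available). Catalog, schema and group names are examples only.

# Create a catalog and a schema, the top two levels of Unity Catalog's three-level namespace.
spark.sql("CREATE CATALOG IF NOT EXISTS analytics_dev")
spark.sql("CREATE SCHEMA IF NOT EXISTS analytics_dev.sales")

# Grant an account-level group read access to one schema without exposing the rest of the catalog.
spark.sql("GRANT USE CATALOG ON CATALOG analytics_dev TO `data-analysts`")
spark.sql("GRANT USE SCHEMA, SELECT ON SCHEMA analytics_dev.sales TO `data-analysts`")

# Discovery: list the tables the current user can see in that schema.
spark.sql("SHOW TABLES IN analytics_dev.sales").show()
```
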
More than 80% of AI projects fail. The reason? Customers aren’t ready to put AI in production in areas where the agent can cause reputational or financial harm.

Introducing Agent Bricks, a new approach to deploying production-ready AI agents. Agent Bricks uses novel research techniques to automatically generate domain-specific synthetic data and task-aware benchmarks. Based on these benchmarks, it automatically optimizes for cost and quality, saving enterprises from the tedious trial-and-error of current approaches. Now, teams can achieve production-level accuracy and cost efficiency right from the start.

This session will show you how to build and scale real-world AI systems using Agent Bricks. We’ll cover how to design multi-agent systems that combine retrieval, reasoning and generation to answer complex questions grounded in your enterprise data. Along the way, you’ll see how to track experiments, fine-tune performance and build workflows that are reliable, governed and built for scale.

Data engineering doesn’t have to be a patchwork of tools and handoffs. In this session, we’ll introduce you to Lakeflow, the Databricks unified solution for building reliable, scalable data pipelines with less friction and more control. Whether you’re just getting started or managing complex workflows, Lakeflow brings together ingestion, transformation and orchestration into one cohesive experience.

We’ll walk through the key components, including Lakeflow Connect, Lakeflow Spark Declarative Pipelines, Lakeflow Jobs and the new Lakeflow Designer — a visual interface that makes it even easier to build and manage pipelines with minimal code. You’ll see live demos of no-code ingestion, code-optional transformation and unified orchestration across your data estate.

We’ll also share what’s coming next, including support for open source tooling and expanded no-code capabilities. You’ll leave with a clear understanding of how Lakeflow simplifies your stack, increases productivity and provides a strong foundation for building high-performance, governed data pipelines at scale.

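To give a feel for the declarative style discussed above, here is a minimal pipeline sketch in Python; the source path and table names are placeholders, and the pipeline itself is still created and scheduled through Lakeflow.

```python
# Minimal declarative pipeline sketch (runs as pipeline source code in Databricks).
# The cloud storage path and table names are placeholders.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw orders ingested incrementally from cloud storage")
def orders_bronze():
    return (
        spark.readStream.format("cloudFiles")      # Auto Loader handles incremental ingestion
        .option("cloudFiles.format", "json")
        .load("/Volumes/main/raw/orders/")         # placeholder source path
    )

@dlt.table(comment="Cleaned orders ready for analytics")
@dlt.expect_or_drop("valid_amount", "amount > 0")  # declarative data quality expectation
def orders_silver():
    return (
        dlt.read_stream("orders_bronze")
        .withColumn("ingested_at", F.current_timestamp())
    )
```
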
Lakebase is a new Postgres-compatible OLTP database designed to support intelligent applications. Lakebase eliminates custom ETL pipelines with built-in lakehouse table synchronization, supports sub-10ms latency for high-throughput workloads and offers full Postgres compatibility, so you can build applications more quickly.

In this session, you’ll learn how Lakebase enables faster development, production-level concurrency and simpler operations for data engineers and application developers building modern, data-driven applications. We’ll walk through key capabilities, example use cases and how Lakebase simplifies infrastructure while unlocking new possibilities for AI and analytics.

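Because Lakebase is Postgres-compatible, a standard driver is enough to connect; the sketch below uses psycopg2, with every connection value a placeholder for your own instance.

```python
# Connecting to a Postgres-compatible Lakebase instance with a standard Postgres driver.
# Host, database, user and credential values are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="<your-lakebase-instance-host>",
    dbname="<your-database>",
    user="<your-user>",
    password="<your-credential>",
    sslmode="require",
)

with conn, conn.cursor() as cur:
    # Ordinary OLTP reads and writes work as they would against any Postgres database.
    cur.execute("CREATE TABLE IF NOT EXISTS orders (id serial PRIMARY KEY, status text)")
    cur.execute("INSERT INTO orders (status) VALUES (%s)", ("pending",))
    cur.execute("SELECT id, status FROM orders ORDER BY id DESC LIMIT 5")
    print(cur.fetchall())

conn.close()
```
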
New to Databricks SQL or just trying to keep up with everything that’s been added? This session gives you both. We’ll start with a fast-paced introduction to Databricks SQL, including how to set up a warehouse, load data, run queries and build dashboards in just a few minutes. You’ll get a clear view of how the platform supports analysts, developers, admins and business users with a simple and intuitive interface.

From there, we’ll dive into the latest features, with live demos and real-world use cases. We’ll cover the new SQL editor, coding enhancements, streaming tables, materialized views, BI integrations, system tables, observability and cost management tools. We’ll also show how AI is being used under the hood to optimize performance and help you get the most out of your workloads. Whether you’re just starting out or looking to stay current, this session will get you up to speed.

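If you would rather try the basics first, the sketch below runs a query and creates a materialized view through the databricks-sql-connector package; the hostname, HTTP path, token and table names are placeholders for your own workspace.

```python
# Querying a Databricks SQL warehouse from Python (pip install databricks-sql-connector).
# server_hostname, http_path, access_token and table names are placeholders.
from databricks import sql

with sql.connect(
    server_hostname="<workspace-host>",
    http_path="/sql/1.0/warehouses/<warehouse-id>",
    access_token="<personal-access-token>",
) as connection:
    with connection.cursor() as cursor:
        # A materialized view keeps an aggregate refreshed by the warehouse.
        cursor.execute("""
            CREATE MATERIALIZED VIEW IF NOT EXISTS main.reporting.daily_orders AS
            SELECT order_date, count(*) AS orders
            FROM main.sales.orders
            GROUP BY order_date
        """)
        cursor.execute(
            "SELECT * FROM main.reporting.daily_orders ORDER BY order_date DESC LIMIT 10"
        )
        for row in cursor.fetchall():
            print(row)
```
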
More than 80% of AI projects fail because customers aren’t ready to deploy AI where it could cause reputational or financial harm. This session changes that.

We’ll start by introducing Agent Bricks, a revolutionary approach that uses novel research techniques to automatically generate domain-specific synthetic data and task-aware benchmarks. Based on these benchmarks, it automatically optimizes for cost and quality, eliminating the tedious trial-and-error of current approaches to deliver production-level accuracy from the start.

Then we’ll dive deeper, showing you how to build custom AI agents from scratch using Agent Bricks. You’ll learn to design multi-agent systems that combine retrieval, reasoning and generation to answer complex questions grounded in your enterprise data. We’ll cover experiment tracking, performance optimization and building reliable, governed workflows at scale.

Whether you’re starting your AI journey or scaling existing efforts, this session provides both out-of-the-box solutions and the foundational knowledge to build next-generation GenAI solutions ready for production deployment.

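Agent Bricks automates most of this, but as a rough mental model of the retrieval-plus-generation loop described above, the sketch below wires together two hypothetical helpers and tracks the run with MLflow. The `retrieve` and `generate` functions are placeholders, not Agent Bricks APIs.

```python
# Rough mental model of a retrieval + generation agent step with MLflow experiment tracking.
# `retrieve` and `generate` are hypothetical placeholders, not Agent Bricks APIs.
import mlflow

def retrieve(question: str) -> list[str]:
    # Placeholder: in practice this would query a vector index over enterprise documents.
    return ["policy document snippet", "pricing guideline snippet"]

def generate(question: str, context: list[str]) -> str:
    # Placeholder: in practice this would call a served foundation model with the context.
    return f"Answer to {question!r} grounded in {len(context)} retrieved passages."

def answer(question: str) -> str:
    with mlflow.start_run(run_name="agent-experiment"):
        context = retrieve(question)
        response = generate(question, context)
        mlflow.log_param("question", question)
        mlflow.log_metric("num_context_passages", len(context))
        return response

print(answer("Which contracts renew this quarter?"))
```
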
Discover how the latest innovations in Databricks AI/BI Dashboards and Genie are transforming self-service analytics. This session offers a high-level tour of new capabilities that empower business users to ask questions in natural language, generate insights faster and make smarter decisions.

Whether you’re a long-time Databricks user or just exploring what’s possible with AI/BI, you’ll walk away with a clear understanding of how these tools are evolving — and how to leverage them for greater business impact.

Tired of waiting on SAP data? Join this session to see how Databricks and SAP make it easy to query business-ready data — no ETL. With Databricks SQL, you’ll get instant scale, automatic optimizations and built-in governance across all your enterprise analytics data.

Fast, AI-powered insights from SAP data are finally possible — and this session shows you how.

Databricks Apps is now generally available, offering a new app hosting platform that brings together everything needed to build production-ready data and AI applications. It gives data and developer teams a way to create custom interfaces on top of the Data Intelligence Platform, making it easier to extend the reach of data and AI across the organization.

This session will explore common use cases like RAG chat apps, interactive visualizations and custom workflow builders. You’ll learn best practices and design patterns for building apps that are scalable and easy to maintain. The session will wrap with a look at the roadmap and strategy for what’s next with Databricks Apps.

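Apps are written with familiar Python frameworks; as one small illustration (not a full RAG app), a chat-style entry point in Streamlit could look like the sketch below, where `query_model` stands in for whatever model serving call your app makes.

```python
# Minimal Streamlit entry point for a Databricks App (app.py).
# `query_model` is a placeholder for a call to your own model serving endpoint or RAG chain.
import streamlit as st

def query_model(prompt: str) -> str:
    # Placeholder: replace with a real call to a served model or retrieval chain.
    return f"(model response to: {prompt})"

st.title("Ask our data")

prompt = st.text_input("Question")
if prompt:
    st.write(query_model(prompt))
```
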
Databricks is changing how organizations work together by making data sharing more open, secure and flexible. With new capabilities like Clean Rooms for multiparty collaboration, cross-platform view sharing, Sharing for Lakehouse Federation and Databricks Apps in Databricks Marketplace, teams can now share and access data more easily, whether or not they’re on Databricks.

This session includes live demos of the tools driving this shift. We’ll show Delta Sharing, the open protocol for seamless cross-platform data sharing. We’ll walk through Databricks Marketplace, a central hub for discovering and monetizing data and AI assets. And we’ll dive into Clean Rooms, which enable secure collaboration without exposing raw data. You’ll leave with a clear view of how to use these tools to drive faster insights and better partnerships across your ecosystem.

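To make the open protocol concrete, the short sketch below reads a shared table with the open source delta-sharing Python client; the profile file and the share, schema and table names are placeholders a data provider would give you.

```python
# Reading a table exposed over the open Delta Sharing protocol (pip install delta-sharing).
# The profile path and share/schema/table coordinates are placeholders from your provider.
import delta_sharing

profile = "/path/to/config.share"                          # credentials file from the data provider
table_url = profile + "#retail_share.sales.daily_orders"   # <profile>#<share>.<schema>.<table>

# List everything the provider has shared with you.
client = delta_sharing.SharingClient(profile)
print(client.list_all_tables())

# Load one shared table into pandas without first copying it into your own platform.
df = delta_sharing.load_as_pandas(table_url)
print(df.head())
```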