Sponsored by: Capital One Software | How to Manage High-Quality, Secure Data and Cost Visibility for AI
Overview
| Experience | In Person |
| --- | --- |
| Type | Breakout |
| Track | Data and AI Governance |
| Industry | Enterprise Technology, Health and Life Sciences, Financial Services |
| Technologies | Databricks Workflows, Unity Catalog |
| Skill Level | Beginner |
Companies need robust data management capabilities to build and deploy AI. Data needs to be easy to find, understandable, and trustworthy. It's even more important to secure data properly from the beginning of its lifecycle; otherwise, it risks exposure during training or inference. Tokenization is a highly efficient method for securing data without compromising performance. In this session, we'll share tips for managing high-quality, well-protected data at scale that are key to accelerating AI. We'll also discuss how to integrate visibility and optimization into your compute environment to manage the hidden cost of AI: your data.
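To illustrate the general idea behind tokenization (not the specific approach covered in this session), here is a minimal Python sketch: sensitive values are replaced with deterministic tokens via a keyed hash, and a vault maps tokens back for authorized access. The `tokenize`/`detokenize` names, the in-memory vault, and the key handling are all illustrative assumptions; a production system would use a managed secret and a hardened, access-controlled token vault.

```python
import hmac
import hashlib

# Illustrative secret key; in practice this would come from a secrets
# manager, never from source code.
SECRET_KEY = b"replace-with-a-managed-secret"

# Simple in-memory vault mapping tokens back to original values.
# Stands in for a hardened, access-controlled token vault service.
_vault: dict[str, str] = {}


def tokenize(value: str) -> str:
    """Replace a sensitive value with a deterministic token."""
    token = hmac.new(SECRET_KEY, value.encode("utf-8"),
                     hashlib.sha256).hexdigest()[:16]
    _vault[token] = value  # store mapping for authorized detokenization
    return token


def detokenize(token: str) -> str:
    """Recover the original value; callers must be authorized by the vault."""
    return _vault[token]


# Example: tokenize a column of sensitive values before the data is used
# for model training or inference.
ssns = ["123-45-6789", "987-65-4321"]
tokens = [tokenize(s) for s in ssns]
print(tokens)                 # stable tokens, safe to join and analyze on
print(detokenize(tokens[0]))  # original value, for authorized access only
```

Because the tokens are deterministic, downstream joins and aggregations still work on the tokenized data, which is why tokenization can protect data without compromising analytical performance.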