CUSTOMER STORY

Pursuing greater possibilities in the cloud

300%

ROI from migrating one data center to the cloud

3x

Acceleration in data science cycles

3

Months eliminated from process of building out new infrastructure

CLOUD: Azure

“For AT&T, migrating our data platforms to the cloud was about scalability, elasticity, embracing open formats, simplicity and cross-team collaboration. Azure Databricks enabled us to achieve these goals while saving cost.”

— Praveen Vemulapalli, Head of Network Traffic Data Collect and AI Platforms, AT&T, Chief Data Office

AT&T helps millions of U.S. families, friends, neighbors and businesses connect to greater possibility. But until recently, the company’s data architecture prevented decision-makers from connecting to a single source of truth. So AT&T migrated to Azure Databricks to eliminate data silos, reassign compute resources to strategic tasks and reduce the costs and complexity associated with their infrastructure. Today, analytic users across AT&T who work with data access it via the data platform and share their reports, dashboards and models with other teams. The company has achieved significant savings by shutting down data centers and has moved members of their technical staff up the value chain to work on optimizing processes rather than patching operating systems. AT&T now aims to connect all their data systems to the lakehouse to enable single-version-of-the-truth data products and democratize data-powered decision-making.

Legacy data architecture drives up costs, hinders decision-making

Many of today’s forward-thinking companies are moving from on-premises Hadoop data lakes to modernized cloud architectures. Few of these companies, however, face challenges on the scale AT&T confronted when they planned such a migration in 2020. The company hoped to empower their data teams by democratizing data and scaling up their AI efforts without overburdening DevOps. AT&T also aimed to make better use of insights to improve the customer experience and increase efficiency across the enterprise. But moving a large, complex architecture comes with risks. AT&T’s data platform ecosystem ingested more than 10 petabytes of data per day, managed 100 million petabytes across the network and came with a high cost of ownership.

“Our primary motivations for our cloud migration were scalability, elasticity, nimbleness, time to market, delivering additional value to our customers and reducing cost,” recalled Mark Holcomb, Distinguished Solution Architect at AT&T. “Our data center was capital-intensive in terms of software costs and maintenance. We had tens of thousands of CPUs on an enormous number of servers, and our IT resources were spending too much time on low-value activities such as patching operating systems. We started looking for a more nimble, scalable and cost-effective solution in the cloud that would enable us to move more of our resources up the value chain so they could focus on optimizing our business and helping us deliver more value to customers.”

Across their legacy architecture, AT&T ran several data management platforms, which locked the efforts of data teams into silos. This architecture made it difficult to access data and led to data duplication and latency issues. As a result, key data metrics often failed to provide a single version of the truth.

“We had situations where different business leaders would report on the same metric, but each one was presenting slightly different numbers,” Holcomb explained. “For example, customer churn is a very important metric in our business, but it’s difficult to address the problem when each business unit has its own version of the truth. We had different teams analyzing different versions of data and making decisions based on that analysis. It was time to unify everyone on a single, reliable set of business data.”

Modern cloud architecture transforms the data culture across the enterprise

Because AT&T serves more than 70 million postpaid wireless phone subscribers and more than 7.5 million broadband households, the company wanted to ensure business continuity during their migration. AT&T prioritized data privacy, security and governance as they took steps to democratize data. Shortly after the company began evaluating potential data engineering and AI solutions, Azure Databricks emerged as a strong candidate.

“We were looking specifically for a solution that could support all the use cases we had been running on-premises up to that point,” said Praveen Vemulapalli, Head of Network Traffic Data Collect and AI Platforms at AT&T, Chief Data Office. “Azure Databricks came in as a highly mature product that checked all of our boxes. We quickly determined that it could also support us in transforming our huge SAS analytics platform — including data storage, data science and advanced analytics — from on-premises to the cloud. Most importantly, Azure Databricks supported our needs to build everything in a private environment that would satisfy the stringent data security regulations under which we operate.”

Although AT&T’s end users had spent many years working with the company’s previous data architecture, they embraced Azure Databricks. The company collaborated with Databricks and Microsoft to develop and implement extensive training materials. AT&T then selected change leaders from more than 60 business units and provided them with detailed, step-by-step job aids that would help them complete daily tasks on Azure Databricks.

This thoughtful strategy paid off. AT&T spent the next nine months moving their workloads to Azure Databricks and established enterprise data assets that replaced siloed data warehouses. 

“Moving to Azure Databricks has transformed the data culture at AT&T,” Vemulapalli reported. “Instead of people analyzing data on their own laptops and saving the results locally, they’re all coming to the cloud to collaborate in one place. They’re sharing data assets with the rest of the company in notebooks, folders and tables and enhancing the overall quality of our analysis.”

Across AT&T, data teams and business users alike use Microsoft Power BI to build business intelligence dashboards that access data through a SQL warehouse in Delta Lake storage. When collaborators want to drill down to a machine learning model underneath a report or dashboard, they can easily access everything they need in the data lake. Azure Databricks has become the platform of choice for AT&T’s data engineers and MLOps engineers. 

“When we migrated to Azure Databricks, we prioritized retiring our most compute-intensive workloads to help us contain the growth of our previous on-premises data lake,” said Vemulapalli. “By the time we had gotten all our data out of our old architecture, we had already retired about 40% of the infrastructure we had been using. This greatly accelerated our ROI from the project.”

Data center migration yields 300% ROI

By migrating to Azure Databricks, AT&T achieved one of their most important goals: data democratization. The company now supports nearly 90,000 internal customers on one data architecture and has reduced their Hadoop and Teradata footprint while eliminating countless data silos. Within this new architecture, AT&T can spin up new computing environments in hours, rather than the three to four months previously required. 

Much of AT&T’s business case for the cloud migration was based on the economic value of shutting down various data centers. When AT&T migrated their Hadoop data lake to Azure Databricks, they achieved a five-year ROI of 300% while retiring more than 80 schemas and streamlining their overall data footprint.

Just as significantly, AT&T has reduced the amount of compute resources they must devote to maintaining their data products, leaving more capital available for strategic activities such as data science and data engineering. “We used to have to squeeze our data science activities into a relatively small compute bucket,” recalled Holcomb. “That meant we had to serialize our activities, which extended the timeline for testing a hypothesis with different inputs. By using Databricks clusters to do that processing, we’ve accelerated our data science cycles by about 3x.”

As AT&T continues to look for ways to serve their millions of customers more efficiently, the company is converging on a single source of truth. Vemulapalli believes the lakehouse architecture will help make this vision a reality.

“AT&T is completely aligned with the lakehouse strategy for our data architecture,” Vemulapalli remarked. “We just need to keep working to implement it completely across our environment. For example, we’re in the process of migrating our data marts to the lakehouse. Also, Databricks continues to release new services such as Unity Catalog, which we know will be an excellent solution for enhancing our data governance. We look forward to going live on Unity Catalog in the near future.”

AT&T’s innovation won’t stop there. The company plans to continue working closely with Databricks and Microsoft as they implement new services in Azure and complete their transition to the lakehouse. 

“Our partnership with Databricks and Microsoft has been exceptional throughout this project,” Vemulapalli concluded. “But as much as we’ve accomplished together, the journey continues. Every time Databricks releases a new feature, our teams get together to figure out how it will fit into our environment. We continue to work towards having all the applications in our architecture interact directly with a single source of data through the lakehouse. Databricks and Microsoft are helping us get to the future and deliver greater possibilities for our customers.”