Lessons from the front lines
Read insights from thought leaders and success stories from leading organizations.
Adventures in data democracy
Since its founding 144 years ago, AT&T has reinvented itself many times to harness historic disruptive innovations such as the transistor, the communication satellite, and more recently, the solar cell.16 Today, the global technology, media, and telecommunications giant is reinventing itself again—this time
as a pioneer in the use of ML, which it is deploying broadly in areas such as IoT, entertainment, and customer care.17
The company is also leveraging ML to reimagine the way it finds, organizes, and uses data. “One of the things we wanted to do was automate some of the routine cleansing and aggregation tasks that
data scientists have to perform so they could focus on more sophisticated work,” says Kate Hopkins, vice president of data platforms, AT&T Chief Data Office.18 Likewise, the company wanted to develop a way to democratize meaningful data, to the extent consistent with privacy, security, and other data use policies, making it more broadly available to qualified
personnel across the enterprise. These efforts, Hopkins says, have already borne fruit. New tools have shrunk the time required to take ML models from prototype to full-scale production. These
models have had dramatic results, such as blocking 6.5 billion robocalls to customers, deterring fraud in AT&T stores, and making technicians' visits to customer homes more efficient.
AT&T started its data transformation journey in 2013 when it began aggregating large volumes of customer and operational data in data lakes. In 2017, the company created a chief data office with
the goal of leveraging these rapidly growing data stores for “hyper-automation, artificial intelligence, and machine learning.” The ongoing work of achieving these goals has presented several significant
challenges. First, in a company as large as AT&T, it was sometimes difficult to find and access potentially valuable data residing in legacy systems and databases. And even when data scientists eventually
found such data, they occasionally struggled to understand it, since it was often labeled inconsistently and offered no discernible context or meaning. Finally, there was a formidable latency challenge across
all data systems that, left unaddressed, would stymie the real-time data needs of ML models.
To address these challenges, the chief data office developed the Amp platform. Amp enables a culture of technology and data-sharing, reusability, and extensibility at AT&T. Pari Pandya, director
of technology and project manager for Amp, says that what began a few years ago as an internal online marketplace (aggregating microservices, APIs, chat bots, designs, etc.) for accelerating automation,
has evolved into a single, powerful source of data truth for systems and users. Consider this: As data flows through multiple systems and processes, its definitions change. Amp not only finds legacy system
data, it uses metadata to ascribe meaning to this data, and provides a clear lineage to help users better understand the data. “It serves as a business intelligence platform that provides not only meaningful
data but analytic and visualization tools that empower business teams, strategists, and product developers to leverage data in more advanced ways and share insights through data communities,” Pandya says.19
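The article does not describe Amp's internals, but the metadata-and-lineage idea can be illustrated with a small sketch. Everything below is hypothetical rather than AT&T's actual design: the `DatasetEntry` record, its field names, and the sample datasets are invented. The point is that a catalog entry attaches business meaning to raw data and records which upstream datasets it derives from, so a user can walk the lineage back to the source.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetEntry:
    """A catalog record that attaches business meaning and lineage to data (illustrative only)."""
    name: str
    description: str    # human-readable meaning ascribed via metadata
    source_system: str  # where the data originates
    upstream: list = field(default_factory=list)  # lineage: datasets this one derives from

    def lineage(self):
        """Walk upstream entries to reconstruct the full derivation path."""
        path = [self.name]
        for parent in self.upstream:
            path = parent.lineage() + path
        return path

# A legacy billing extract feeds a cleansed customer view (hypothetical datasets)
raw = DatasetEntry("billing_extract", "Nightly dump from legacy billing", "BILLING-DB")
clean = DatasetEntry("customer_view", "Deduplicated customer records", "catalog",
                     upstream=[raw])
print(clean.lineage())  # ['billing_extract', 'customer_view']
```

Even this toy version shows why lineage helps: a data scientist who finds `customer_view` can see at a glance that it ultimately comes from the legacy billing system, without reverse-engineering the pipelines in between.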
To meet the challenge of latency, AT&T is on a multiyear journey to move some of its data and tools to the public cloud. Working closely with cyber teams to ensure data and IP security, the company
is leveraging the cloud’s ability to scale up compute power as needed. The cloud’s power is helping create the real-time access that ML—as well as enterprise stakeholders and customers—requires. Unlimited
access to compute on demand through the cloud and the availability of business-ready data is accelerating the journey.
Hopkins notes that AT&T’s data transformation journey has yielded another welcome
benefit. “The business units have become much more knowledgeable about data science and are identifying opportunities to use data in new ways. Across the board they’re requesting much more mature and sophisticated
data,” she says, adding that “being able to democratize data and make the process transparent across the enterprise can deliver exponential payback.”
Data and IT double-team digital transformation
How can a 100-year-old retail organization efficiently and accurately take data from legacy applications that were designed for very specific use cases to accomplish something that those applications were never intended to do?
“Every legacy company faces this challenge,” says Paul Ballew, chief data and analytics officer at the leading Canadian food and pharmacy retailer Loblaw.20 “You have to bring those data assets together from across your ecosystem in a way that’s scalable, repeatable, and governable, which is no small task.”
Taking an ecosystem approach to data is particularly formidable in a successful retail organization such as Loblaw, which operates 2,400 stores and maintains an expansive e-commerce presence. “We
are a legacy company trying to leverage technologies that digital natives are born with,” Ballew says.
Yet despite its challenges, data represents a unique opportunity on the path to Loblaw’s digital future. And unique opportunities require unique approaches. Like many digital nonnatives, the company
is shifting its focus from traditional data management priorities such as storage, curation, and quality to a new, more complex arena in which data analytics and digital solutions drive day-to-day operations.
“It requires a different approach to ‘baking the soufflé,’” Ballew says. “We source and mix ingredients differently, and then serve it in new ways to those consuming it.”
Recognizing the critical importance of data in the company’s digital future, Loblaw set up a distinct data organization that works in tandem with IT to drive digital transformation and engage the
business.
From a technology standpoint, Loblaw takes a three-layered approach:
Banking on distributed data architecture
ABN AMRO is taking a modern approach to data management. Rather than engineering endless workarounds to accommodate problems with the data pulsing through its systems, the Netherlands-based global bank has developed a feedback mechanism that enables data
scientists to request data quality issues be fixed at the source and focus on turning data into value. “In the past, data scientists would find a problem, fix it, and keep going,” says Santhosh Pillai, chief
architect, data management. “Now they can provide feedback to the source where data is mined, and say, ‘do it differently.’ Over time, data quality improves, and data scientists don’t have to spend as
much time on cleansing and querying.”21
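As a hedged illustration of this feedback-to-source idea (not ABN AMRO's actual tooling; the `SourceFeedbackRegistry` class and every name in it are invented), the core mechanism is simple: a consumer files a quality issue against the owning source system rather than patching the data locally, so the fix happens once, upstream.

```python
from dataclasses import dataclass

@dataclass
class QualityIssue:
    dataset: str
    field: str
    problem: str
    status: str = "open"

class SourceFeedbackRegistry:
    """Routes data-quality issues back to the owning source system
    instead of letting each consumer patch the data locally (illustrative sketch)."""
    def __init__(self):
        self.queues = {}  # source system -> list of reported issues

    def report(self, source_system, issue):
        self.queues.setdefault(source_system, []).append(issue)

    def open_issues(self, source_system):
        return [i for i in self.queues.get(source_system, []) if i.status == "open"]

registry = SourceFeedbackRegistry()
registry.report("CRM", QualityIssue("clients", "birth_date", "stored as free text"))
print(len(registry.open_issues("CRM")))  # 1
```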
Strengthening governance at the
source is just one component of a three-pronged approach the bank is taking to prepare for what Pillai calls “the AI decade”—an era when AI increasingly augments or even replaces human decision-making. The
second component focuses on the consumption side, where ABN AMRO has engineered an advanced analytics and AI layer to support business strategies that are evolving rapidly. “In an increasingly digital world,
being client-centric means being data-centric,” Pillai says. “Particularly in the post-COVID era, companies can’t meet face-to-face with clients, so they rely more heavily on data and analytic insights.
The analytics capabilities we have in place deliver these insights and unleash the value contained within our data.”
The third component of ABN AMRO’s data transformation effort is a multifaceted
data mesh model that moves data anywhere it needs to go within the ecosystem, from source all the way to consumer. This “data supply chain” serves not only as a distribution mechanism but as a timing guarantee
mechanism that enables real-time access to meet demand. It also features a self-service “marketplace” where consumers of data—both human and machine—can access high-quality data that is approved for use and
compliant with applicable regulations.
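A minimal sketch of the self-service marketplace idea, with invented catalog entries and flag names: only data products that are both usage-approved and cleared for regulatory compliance are discoverable by consumers.

```python
# Hypothetical marketplace catalog; entry names and flags are illustrative only
catalog = [
    {"name": "payments_events", "usage_approved": True,  "regulator_ok": True},
    {"name": "raw_chat_logs",   "usage_approved": False, "regulator_ok": True},
]

def discoverable(entries):
    """Serve only data products that pass both usage and compliance gates."""
    return [e["name"] for e in entries if e["usage_approved"] and e["regulator_ok"]]

print(discoverable(catalog))  # ['payments_events']
```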
Like many established organizations, ABN AMRO didn’t originally design its data architecture to be event-driven—or for current data usage patterns. Today, algorithms and end
users read up-to-the-minute data far more frequently than they use it in transactions. Legacy data management models were not designed to respond to constant read queries and real-time updates.
“We solved this challenge by putting each original record in a data store and replicating it,” Pillai says. “On the consumer end, users see replicated data delivered with minimal latency and think they are seeing real-time data generated at the point of consumption. In fact, the data they are reading is coming from another part of the ecosystem.”
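Pillai's replication model resembles a classic read-replica pattern. The sketch below is a toy under stated assumptions, not the bank's architecture: writes land in a system of record, change events are replayed into a low-latency replica, and consumers read only the replica while perceiving it as real-time data.

```python
import time

write_store = {}   # system of record: original records live here
read_replica = {}  # low-latency copy that consumers actually query
event_log = []     # change events awaiting replication

def write(key, value):
    """Record the original and log a change event for the replica."""
    write_store[key] = value
    event_log.append((key, value, time.time()))

def replicate():
    """Apply pending change events to the replica; consumers never touch the source."""
    while event_log:
        key, value, _ = event_log.pop(0)
        read_replica[key] = value

write("account:42", {"balance": 100})
replicate()
print(read_replica["account:42"])  # {'balance': 100}
```

The design choice this illustrates: read traffic, which dominates in the usage patterns the article describes, is fully decoupled from the transactional store, so constant queries never contend with writes at the source.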
Pillai sees great potential in this data replication model,
particularly in the area of cloud storage. “Traditionally, technology was designed to optimize data storage. But as we approach the AI decade, I expect to see more companies develop mechanisms for replicating
data that is stored in several clouds and even moving that data between multiple cloud vendors.”
Learn more
Download the trend to explore more insights, including the “Executive perspectives” where we illuminate the strategy, finance, and risk implications of each trend, and find thought-provoking “Are you ready?” questions to navigate the future boldly.
Senior contributors
Edward Bowen, Carl Gerber, Adarsh Gosu, Jason Price, Piyush Sacheti, and Matt Iames
Endnotes