For years, tech estate modernization has been a goal of C-suite leaders across industries seeking to reduce technical debt and enable new business capabilities. Given legacy tech constraints, incremental change has often been the de facto approach: systematically retiring applications, pivoting to modern delivery methods like DevSecOps (development, security, and operations), and migrating workloads to the cloud. Now, chief information officers may be asking: Can artificial intelligence help us accelerate modernization?
AI, engineering, and infrastructure capabilities are advancing so dramatically that an incremental approach to tech transformation is likely no longer the only way to modernize.
At this stage, three approaches appear to be emerging for tech estate modernization with AI: using gen AI to make current IT processes and tools more efficient; reengineering the digital core with AI-enabled engineering; and reimagining business processes end to end, then building AI in.
Each is distinct and builds on the prior in terms of business and technology strategy ambition.
Some organizations are leveraging gen AI and AI agents to rethink tech processes across the information technology function. AI coding tools can transform how applications are developed and delivered,1 reducing operational costs and speeding capability delivery. But there is also a new frontier: using gen AI to increase the efficiency of current processes and tools.
A company might choose this path if it has many federated or distributed systems that can be enabled by application programming interfaces (APIs) to feed into an intelligent dashboard layer or workflow. Gen AI can help structure the unstructured and create a clearer picture from disparate data sources, bringing a new level of sophistication to systems that broader modernization strategies may previously have left unmodernizable.
For example, an IT chief data officer at an oil and gas company interviewed for Deloitte’s fourth quarter 2024 State of Generative AI study managed 1,000 pieces of equipment across 70 to 80 sites.2 Every asset had a control- or command-center team, making the predictive maintenance dashboard overwhelming. To help address the complexity, the team used gen AI to build a query tool on top of the dashboard to summarize any major concerns. Given the global scale of its equipment supply chain, the company started with a six-month proof of concept. Currently, the gen AI overlay is helping the company stay ahead of issues across 20 pieces of oil and gas machinery, where unplanned downtime could result in a US$10 billion to US$20 billion loss.3
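To make the pattern concrete, here is a minimal sketch of a gen AI “query overlay” on an alert feed. The alert schema, site names, and model choice are hypothetical, and the OpenAI Python client stands in for whichever model and tooling the company actually used; this is an illustration of the technique, not the company’s implementation.

```python
# Illustrative sketch: a gen AI summarization layer over a
# predictive-maintenance alert feed aggregated from per-site
# control systems via APIs. All data below is hypothetical.
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical alerts pulled from distributed site systems.
alerts = [
    {"site": "Gulf-07", "asset": "compressor-12", "severity": "high",
     "signal": "vibration trending 30% above baseline for 48h"},
    {"site": "North-Sea-03", "asset": "pump-4", "severity": "medium",
     "signal": "discharge pressure drifting below setpoint"},
]

def summarize_major_concerns(alerts: list[dict]) -> str:
    """Condense raw dashboard alerts into a short, prioritized
    summary an operations leader can act on."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "You summarize equipment alerts. Rank by severity, "
                        "flag anything that risks unplanned downtime, and "
                        "keep it under five bullet points."},
            {"role": "user", "content": json.dumps(alerts)},
        ],
    )
    return response.choices[0].message.content

print(summarize_major_concerns(alerts))
```

The design choice worth noting is that the dashboard and its underlying data pipelines are left untouched; the gen AI layer sits on top as a consumer, which is what makes this the least invasive of the three approaches.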
CIOs choosing this path should consider whether to develop agents using a packaged solution or a custom build, weighing the impacts across the value chain, including the tech workforce. The decision should be based on the required level of customization, data control, and the cost of packages and platforms. We expect more focus on customization where it relates to actual products rather than back-office processes, given strong opportunities to drive business intelligence.
A bolder step draws lessons from the last major core IT transformation: the move to the cloud. During that transition, many enterprises took a lift-and-shift approach, migrating legacy applications without refactoring or rewriting their code, which may have lowered return on investment. The real benefit was often delayed because applications were not rewritten and refactored to more open architectures until after migration. While that approach often brought more flexibility and cost efficiency than low-code or no-code packages, it likely also contributed to higher costs and unnecessary pain downstream.
Some CIOs are taking a bolder approach to reengineering their digital core with more intelligent AI-enabled technologies. This type of modernization approach covers modern architecture, containers, and the necessary integrations while leveraging AI to help make it as affordable as possible.
The engineering-and-AI approach is grounded in the belief that simply recoding a poorly written application in a new programming language, without redesigning the solution, may be too expensive, and that the problem warrants action now.
Another issue is data transformation. Reengineering applications and platforms might be a CIO’s best chance to move and modernize data to make it easy for AI and enterprise systems to consume. This new data paradigm should account for distributed ownership, unstructured inputs, high-speed ingestion, and scalable inferencing. However, many organizations are not equipped for this level of data transformation.
In terms of infrastructure transformation, AI is expected to strain enterprises’ existing computing infrastructure and increase energy demand from inferencing unless computing environments undergo some reimagining. CIOs should look to get ahead of this issue by testing and learning in the cloud, then use those insights to help redesign a more modern hybrid infrastructure, including private AI infrastructure alternatives.
What could this approach look like in practice?
The head of finance for one investment management company interviewed for Deloitte’s fourth quarter 2024 State of Generative AI research explained their journey with generative AI.4 The company’s hyper-personalized, AI-driven investment management solution leverages master data from its almost 100 million retail clients, including their personal life-stage preferences and risk-return tolerances. This allows the company to dynamically adjust a client’s portfolio in real time, benchmarking bespoke portfolios against the S&P 500. Though in the early stages, the AI solution is now more competitive with robo-advisory challengers, and more efficient: A portfolio manager who previously managed only one or two portfolios can now handle seven or eight. The return on investment also shows up as a longer-term transformation of the operating model and of the number of basis points it takes to originate a portfolio.5
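For illustration only, the sketch below shows what a risk- and life-stage-aware personalization rule and an S&P 500 benchmark comparison might look like in code. Every field, threshold, and weight here is invented for the example; it is not the firm’s actual model.

```python
# Hypothetical client profile and a naive rule for tilting equity
# exposure by risk tolerance and life stage, then measuring
# performance against an S&P 500 return in basis points.
from dataclasses import dataclass

@dataclass
class ClientProfile:
    age: int
    risk_tolerance: float   # 0.0 (conservative) .. 1.0 (aggressive)
    life_stage: str         # e.g., "accumulation", "retirement"

def target_equity_weight(profile: ClientProfile) -> float:
    """Blend a classic age-based glide path with the client's stated
    risk tolerance; clamp the result to [0.10, 0.95]."""
    glide_path = max(0.0, (110 - profile.age) / 100)
    blended = 0.5 * glide_path + 0.5 * profile.risk_tolerance
    if profile.life_stage == "retirement":
        blended *= 0.8  # dampen equity exposure in drawdown years
    return min(0.95, max(0.10, blended))

def excess_return_vs_benchmark(portfolio_return: float,
                               sp500_return: float) -> float:
    """Basis points of out- or underperformance vs. the S&P 500."""
    return (portfolio_return - sp500_return) * 10_000

client = ClientProfile(age=42, risk_tolerance=0.7, life_stage="accumulation")
print(f"target equity weight: {target_equity_weight(client):.0%}")
print(f"excess return: {excess_return_vs_benchmark(0.091, 0.085):.0f} bps")
```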
Blended human-and-AI development teams are changing software development. Amazon uses its gen AI assistant, Amazon Q, to upgrade applications to Java 17, reducing the average time to upgrade an application from 50 developer-days to a few hours. These upgrades have also enhanced security and reduced infrastructure costs, resulting in an estimated US$260 million in annualized efficiency gains.6
Additionally, Airbnb completed its first large-scale, large language model–driven code migration with the help of frontier models and robust automation, updating around 3,500 React component test files from Enzyme to React Testing Library in six weeks, compared with an earlier estimate of 1.5 years of manual effort.7 Goldman Sachs has used auto-coding to improve developer proficiency by 20%.8 Conceptually, these agents act as team members, an embedded support system that can code, research, and automate processes alongside human teams.
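A common pattern behind such LLM-driven migrations is a migrate-validate-retry loop: the model proposes a rewrite, the existing test suite validates it, and failures are fed back for another attempt. The sketch below shows that general pattern; the prompt, model name, and jest invocation are illustrative stand-ins, not Airbnb’s or Amazon’s actual pipeline.

```python
# Sketch of a migrate-validate-retry loop for LLM-driven code
# migration. File paths, prompt, and test command are hypothetical.
import pathlib
import subprocess
from openai import OpenAI

client = OpenAI()

MIGRATION_PROMPT = (
    "Rewrite this Enzyme test file to use React Testing Library. "
    "Preserve every assertion and test name. Return only the code."
)

def migrate_file(path: pathlib.Path, max_attempts: int = 3) -> bool:
    """Ask the model for a rewrite, run the tests on the result, and
    feed failures back to the model until it passes or we give up."""
    source = path.read_text()
    feedback = ""
    for _ in range(max_attempts):
        response = client.chat.completions.create(
            model="gpt-4o",
            messages=[
                {"role": "system", "content": MIGRATION_PROMPT},
                {"role": "user", "content": source + feedback},
            ],
        )
        path.write_text(response.choices[0].message.content)
        result = subprocess.run(
            ["npx", "jest", str(path)], capture_output=True, text=True
        )
        if result.returncode == 0:
            return True  # tests pass; migration accepted
        # Include the failure output so the next attempt can self-correct.
        feedback = f"\n\nThe previous attempt failed:\n{result.stderr}"
    path.write_text(source)  # restore the original file
    return False  # route this file to a human developer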
For some organizations, modernizing the tech estate can be an existential exercise. Rather than automating existing processes with AI, they might reimagine the processes first and then incorporate AI. These enterprises amplify AI’s impact by digitally transforming both the front and back office.
Change of this magnitude will not happen overnight. To help drive it, CIOs can work with the business to identify processes that are ripe for change, such as lead-to-cash or lead-to-order conversion, pricing configuration, or order submission. These can be reimagined function by function, translating learnings into a new enterprise core in a hub-and-spoke model.
Some enterprises are just beginning to experiment in this space: for example, using large language models to respond to level-one and level-two customer calls with minimal human assistance, or employing digital twins to simulate the potential impacts of pulling the plug on a legacy system.
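The call-handling experiment typically hinges on one design decision: route only high-confidence cases to the model and escalate the rest. A minimal sketch of that triage pattern follows; the JSON schema, model, and confidence threshold are all hypothetical.

```python
# Sketch of level-one/level-two call triage: the model drafts a
# resolution with a self-reported confidence, and anything below a
# threshold is escalated to a human agent. Schema is hypothetical.
import json
from openai import OpenAI

client = OpenAI()

def handle_call(transcript: str) -> dict:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        response_format={"type": "json_object"},
        messages=[
            {"role": "system",
             "content": "You are a tier-1 support agent. Reply as JSON "
                        'with keys "resolution" (string) and '
                        '"confidence" (a number from 0 to 1).'},
            {"role": "user", "content": transcript},
        ],
    )
    result = json.loads(response.choices[0].message.content)
    if result["confidence"] < 0.8:
        result["routed_to"] = "human_agent"  # escalate ambiguous cases
    return result

print(handle_call("My router reboots every hour since yesterday's storm."))
```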
One pharma company interviewed for Deloitte’s fourth quarter 2024 State of Generative AI in the Enterprise research started by asking gen AI to recommend relevant molecules for drug discovery that fit a specified set of criteria.9 Gen AI invented nonexistent molecules. To address the issue, the company adjusted its strategy: First, a medicinal chemist would select a real molecule; then, gen AI would modify that molecule based on the prompting criteria. A controlled environment with less room for ambiguity made all the difference. Gen AI was able to propose potential structural modifications that could be prioritized by optimization algorithms and predictive modeling to identify the most viable, stable, and likely-to-succeed molecules for in-lab testing.10
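The underlying guardrail pattern, anchoring generation to a real structure and rejecting anything that is not chemically valid before downstream scoring, can be sketched with the open-source RDKit library. The starting molecule and the “proposed” strings below are illustrative stand-ins, not the company’s data or pipeline.

```python
# Sketch of a validity guardrail for gen AI molecule proposals:
# parse each proposal with RDKit and drop hallucinated structures
# before optimization and predictive scoring.
from rdkit import Chem

# A real, chemist-selected starting point (aspirin, as an example).
seed_smiles = "CC(=O)OC1=CC=CC=C1C(=O)O"

# Stand-ins for gen AI output: one valid analog, one hallucinated string.
proposed = [
    "CC(=O)OC1=CC=CC=C1C(=O)OC",  # plausible methyl ester analog
    "C1=CC=XX=C1",                # invalid; will be rejected below
]

def filter_valid(smiles_list: list[str]) -> list[str]:
    """Keep only proposals RDKit can parse into real molecular graphs;
    hallucinated structures fail MolFromSmiles and are dropped."""
    valid = []
    for smi in smiles_list:
        mol = Chem.MolFromSmiles(smi)
        if mol is not None:
            valid.append(Chem.MolToSmiles(mol))  # canonical form
    return valid

candidates = filter_valid(proposed)
# Surviving candidates would next go to optimization algorithms and
# predictive models to rank viability and stability before lab testing.
print(candidates)
```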
Business operational changes may be significant or small, but even minor changes enabled by AI and reasoning capabilities can create an impact across interconnected business systems.
While the three approaches we have detailed are all different and valid, leaders can determine which one is right for them by considering the components in the tech estate and answering the following questions:
The CIOs who are well prepared for the future will likely be those who can mobilize to move at the speed of both technology innovation and business need. This could include a dramatically different operating strategy and a transformation (not modernization) mindset, driven by speed to market, product differentiation, and integrity, all factors that ultimately contribute to shareholder value.