
Five in 5: Feeder of the Future – What the Grid Could Become

By 2042, grid modernization could lead to decentralized, automated, and integrated power delivery systems.

The distribution grid is evolving quickly. Distributed energy resources (DERs), extreme weather, electrification, cybersecurity threats, and the promise of advanced automation with AI are all converging, creating an increasingly complex utility system.

In our previous installment of Five in 5, “Drivers shaping the distribution grid,” we explored rising grid complexity and stakeholder expectations. In this edition, Deloitte’s power and utilities leaders, Craig Rizzo and Christian Grant, discuss the Feeder of the Future: a vision for the grid in 2042. We zero in on what is possible, highlighting that growing grid complexity and stakeholder expectations are likely to lead to greater operating efficiencies, as well as new capabilities and business outcomes that have previously seemed unattainable. Our next edition will outline the roadmap to get from today to this envisioned tomorrow.

  1. What is your vision for the integrated power delivery system of the future? 

    Craig Rizzo: First, let’s clarify what we mean by “integrated power delivery system.” Historically, we’ve treated generation, transmission, and distribution as distinct domains for planning, operations, and financial clearing and settlements. But as DERs, electrification, and digitalization accelerate, these boundaries will continue to blur. As the industry evolves, our terminology should reflect that shift.

    Our vision is a significantly more integrated set of planning, operations, and market processes and systems across these domains. The distribution grid needs to modernize toward an “Integrated Power Delivery System”: an interconnected, intelligent system capable of optimizing in near real-time across utility assets, third-party DERs, electrified transportation, and smart buildings for cost, reliability, safety, and sustainability.

    Christian Grant: The integrated power delivery system of the future will be more than just a “smarter grid”; it will operate as a federated cyber-physical system, with the distribution feeder at its core. Grid operations will be optimized in layers, from the feeder and substation up to the entire transmission system.

    Our analysis of possible futures points to a highly automated system. Following technology and market trends to their logical conclusion reveals that we can, and will need to, automate most controls, given the complexity and speed of decision cycles.

    On a hot summer afternoon with intermittent cloud cover, humans simply cannot act quickly enough to optimize DERs, including flexible loads, while simultaneously accounting for settlement, protection, asset conditions, and any number of other variables. AI, however, appears to be able to do this, enabling the foundational, decentralized layer of the integrated power system. With AI in place, the role of the utility expands from infrastructure provider to grid orchestrator.

    This future also captures a paradigm shift in how we think about planning. We’re moving from periodic, static planning to continuous, scenario-based planning with real-time analytics platforms that support ongoing adaptation. Operations will be automated with the deployment of virtual and physical AI in the grid.

    Imagine a voltage management agent optimizing voltage on a circuit by making physical adjustments. Or, a seemingly extreme possibility is achieving zero human injuries in storm restoration because automated land and air vehicles are able to make most conditions safe, and perhaps even perform some aspects of storm restoration. While these capabilities are within reach, achieving this vision will demand focused attention and clear pathways forward.
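To make the voltage-management agent concrete, here is a minimal sketch of the idea: a deadband controller that nudges a regulator tap to hold feeder voltage near 1.0 per unit. The tap step, deadband, and tap limits are illustrative assumptions, not parameters of any real regulator or of the agent described above.

```python
# Hypothetical deadband voltage-management agent (illustrative parameters).
def voltage_agent(v_measured, tap, v_target=1.0, deadband=0.01,
                  tap_limits=(-16, 16)):
    """Return the new tap position given a per-unit voltage reading."""
    error = v_measured - v_target
    if abs(error) <= deadband:
        return tap                       # within deadband: hold position
    direction = -1 if error > 0 else 1   # lower the tap if voltage is high
    new_tap = tap + direction
    return max(tap_limits[0], min(tap_limits[1], new_tap))

print(voltage_agent(1.03, tap=0))    # voltage high: lower one tap (-1)
print(voltage_agent(0.995, tap=0))   # within deadband: hold (0)
```

A production agent would replace this rule with a learned policy over many more inputs, but the control loop (sense, decide within limits, act) is the same shape.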

    Craig Rizzo: That intelligence must be accompanied by resilience. The grid is exposed to more threats than ever before, from cyberattacks to extreme weather events. The Feeder of the Future has to be designed to withstand those disruptions, such that no single point of failure is a threat to the system, and that failures are addressed effectively and quickly in the normal course of business.

    Today, utilities restore power as quickly as possible after an outage. But by 2042, resilience may mean avoiding disruptions altogether, often without the customer even noticing, with a SAIDI (System Average Interruption Duration Index) measured in seconds rather than minutes or hours.
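For readers unfamiliar with the metric, SAIDI is simply total customer-interruption duration divided by total customers served. A short sketch, with hypothetical outage numbers, shows how today's minutes-scale figures are computed:

```python
# SAIDI = total customer-minutes of interruption / total customers served.
def saidi(interruptions, customers_served):
    """interruptions: list of (customers_affected, duration_minutes) tuples.
    Returns average interruption duration per customer, in minutes."""
    customer_minutes = sum(n * d for n, d in interruptions)
    return customer_minutes / customers_served

# Hypothetical year: two outages for a utility serving 1,000,000 customers.
events = [(50_000, 120), (10_000, 60)]   # (customers affected, minutes)
print(saidi(events, 1_000_000))          # 6.6 minutes per customer per year
```

A seconds-scale SAIDI means the numerator, customer-minutes of interruption, has been driven close to zero by preventing outages rather than shortening them.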

    Today, utilities with mature FLISR (Fault Location, Isolation, and Service Restoration) capabilities can identify a downed line, dynamically isolate the issue to a single block or building, and automatically reroute power. Imagine a future where that capability evolves such that local DERs and microgrids can be automatically orchestrated to eliminate any perceivable outages before they’re reported. Automated safeguards, adaptive grid controls, and analog redundancies will keep critical infrastructure online, even when digital systems are compromised. The integrated power delivery system of the future should be built to endure the pressures we face today and those expected to accelerate in the future.
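The FLISR sequence can be sketched in a few lines. This is a toy model of a radial feeder as an ordered list of sections; the section names and tie-switch logic are illustrative assumptions, not any vendor's scheme:

```python
# Toy FLISR plan on a radial feeder: isolate the faulted section, then
# back-feed the healthy downstream sections through a normally open tie switch.
def flisr(sections, faulted, tie_available=True):
    """Return (switching_plan, restored_sections) for a fault in `faulted`.
    sections: ordered list of section names, substation to feeder end."""
    i = sections.index(faulted)
    plan = [f"open switch upstream of {faulted}"]
    if i < len(sections) - 1:
        plan.append(f"open switch downstream of {faulted}")
        if tie_available:
            plan.append("close tie switch to back-feed downstream sections")
    restored = sections[:i] + (sections[i + 1:] if tie_available else [])
    return plan, restored

plan, restored = flisr(["A", "B", "C", "D"], faulted="B")
print(restored)   # ['A', 'C', 'D']: only faulted section B stays de-energized
```

The future described above effectively adds a further step: dispatching local DERs and microgrids so that even the customers on the faulted section see no interruption.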

  2. What are the elements that enable this vision of AI-powered utilities and a modernized grid?

    Christian Grant: We already know that our vision relies on a few challenging requirements. First, the level of automation needed to enable truly decentralized operations at the feeder level, where an AI agent, for example, can open and close a switch with limited human involvement, will require a digital twin.

    This is necessary for two reasons. First, we must be able to train algorithms with synthetic data in addition to real data, so operators can be confident that these agents, or digital workers, can perform under the most severe conditions.

    Second, we need more data to train algorithms. The current distribution grid generates terabytes of data, data that must be harnessed to train AI algorithms effectively. In addition, similar to what we have seen in the automobile industry where we do not want to crash real cars to collect data, we do not want actual outages to be our only data source. Imagine the value of simulating thousands or millions of storms day after day to create automated controls with the equivalent of hundreds of years of experience.
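The simulation idea above can be sketched as a Monte Carlo generator of labeled storm scenarios. The failure model here, Bernoulli asset failures with a probability that rises with wind speed, is a deliberately simple assumption for illustration, not a claim about how such simulators are actually built:

```python
# Hypothetical synthetic-storm generator for training grid-automation models.
import random

def synthetic_storm(n_assets=200, base_fail_prob=0.002, rng=None):
    rng = rng or random.Random()
    wind = rng.uniform(40, 120)                       # peak gust, km/h
    p = min(1.0, base_fail_prob * (wind / 40) ** 2)   # failures rise with wind
    failed = [a for a in range(n_assets) if rng.random() < p]
    return {"wind_kmh": wind, "failed_assets": failed}

# A thousand simulated storms yields a labeled corpus without a single
# real outage, the grid analogue of crash-testing cars in simulation.
rng = random.Random(42)
storms = [synthetic_storm(rng=rng) for _ in range(1000)]
print(len(storms))
```

Scaled up, with physics-based feeder models in place of this toy failure rule, the same loop produces the "hundreds of years of experience" described above.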

    Finally, infrastructure is needed to embed compute within the grid: the network and communications equipment and systems, the physical and cybersecurity systems, and the actual distribution system hardware. Industry professionals will recognize that this all translates into capital, which is likely to challenge affordability. That is why developing a cost-effective way to get there must be the primary concern; this is not lost on us.

    Craig Rizzo: Christian mentioned the security aspect, and I think that piece cannot be emphasized strongly enough. Introducing these AI and automation technologies demands a rethink of grid cybersecurity.

    As we push intelligence to the grid edge, we are multiplying the number of access points. Consequently, we are increasing the risk of both physical and cyber threats. Traditional encryption may not be sufficient in a world where AI and even quantum computing could quickly render today’s security protocols obsolete. Utilities will probably need to build advanced encryption capabilities and security safeguards well before quantum computing reaches full scale.

    Now, the goal here is not to panic. It is to prepare. Grid infrastructure takes years to modernize, and we cannot afford to be caught off guard.

    Securing the grid of 2042 will require encryption and threat detection systems capable of identifying and neutralizing threats in real time. Christian mentioned the importance of the digital twin, and real-time security is a key reason for that necessity. A tightly synchronized, high-fidelity digital twin will play a central role in delivering secure automations.

    As cyber and physical attacks grow more sophisticated, utilities will need to stay ahead of them. This requires, for example, training AI models on millions of scenarios so that, as they continuously scan network activity, sensor data, and physical security footage, they can detect vulnerabilities, identify where attacks may occur, and proactively secure those weak points. That combination enables early detection of coordinated cyber-physical attacks.

    In some cases, AI agents may support utility personnel in coordinating rapid response, from software-level patching to deploying drones or field robotics as part of a broader physical response. In this future, security will be an intelligent, multi-layered, adaptive system that evolves as fast as the threats it faces, not only a static set of defenses.

  3. How will employees and customers interact differently with the Feeder of the Future?

    Christian Grant: We often see our utility clients focusing on technology solutions. But the people, organization, and governance aspects are important to get right. The employee experience will shift as automation and robotics handle tasks that involve multi-variable problems beyond human capabilities, as well as what I mentioned before: risk of injury. This means employees will likely spend time reviewing the outputs of large language models and digital twins, leading to proactive rather than reactive operations.

    The transition to a proactive posture will reduce truck rolls, optimize supply chains, and lead to new levels of O&M cost efficiency, returning to the priority of affordability. Overall, the feeder of the future will help empower employees to engage in safer and more strategic work. Make no mistake though, this will require effective change management.

    Craig Rizzo: Customers will notice a significant shift in how they interact with their utility and manage their energy consumption. This shift could manifest through “local energy communities,” where neighborhoods operate as semi-autonomous nano-grids, optimizing local generation, storage, and load management. Utilities might then facilitate flexible and dynamic pricing structures based on real-time grid conditions, offering customers greater control over their energy costs and reliability preferences. For example, residential communities with high solar penetration could dynamically adapt their energy consumption based on real-time signals, optimizing affordability and resilience simultaneously.

    From the customer’s perspective, the most compelling changes might be those that simplify their energy interactions entirely. Agent-driven platforms could automate energy management decisions, allowing seamless plug-and-play connections for home solar installations, batteries, or EV charging without lengthy approvals or intervention. Imagine a future in which connecting new energy resources takes minutes or hours rather than weeks or months, significantly reducing friction while enhancing grid responsiveness.

    Customers gain more autonomy over their energy decisions, while utilities streamline the integration process, improving overall satisfaction and grid efficiency.

  4. As utilities move towards Feeder of the Future capabilities, where should they be most optimistic, and where should they temper their expectations?

    Craig Rizzo: There’s a lot to be optimistic about. Emerging AI will change utilities and the grid, there’s no question. Utilities can take tangible steps today to start realizing value and get a quick ROI.

    For example, AI-enabled vegetation management and distribution planning are returning near-term value. These use cases can be a good opportunity to leverage cost savings to pay for scaling AI.

    Not starting now could be costly, with delayed learnings, missed cost savings, and a steep learning curve. Data will be key. Getting the right data architecture now is crucial.

    Although immediate value is achievable, utilities should temper expectations about the pace of implementation. Transitioning is our clients’ biggest concern. Concerns about existing business and operating processes, talent, and unproven solutions are all real. This simply means the transition should be deliberate and measured. We think successful AI transformations will balance speed-to-value and early ROI with building the right foundations for long-term scaling.

    Christian Grant: We also need to balance our excitement with realistic expectations about cost and the brownfield aspect of the journey.

    Craig and I have both underscored affordability as a priority. Cost is why developing and maintaining a customized roadmap that makes sense for each utility’s starting point is important. To think that each utility is going to finance the journey to achieve this vision with their own balance sheets is unrealistic. As utilities are doing with the new generation of AI data centers, the grid will require structures where third-party balance sheets can be used to finance some portion of this transition. That is why Craig and I and the broader utility practice at Deloitte are prioritizing distribution planning to identify and develop advanced automations to locate, analyze, and initiate these types of market relationships.

    Many utility executives across our client portfolio say that building a new utility from scratch would be easier than modernizing what we have. This brownfield reality means that current accounting lifecycles, IT, and operating procedures will determine the pace and priority of advancement, and are likely to misalign with how a greenfield program would sequence investments. This is why we will continue to counsel clients to increase stakeholder engagement so that customers, local leaders, regulators, and others understand the tradeoffs.

  5. If you were the CEO of a utility today, what strategic priority would you recommend to your board to achieve this vision of the future?

    Christian Grant: Utilities must invest in foundational elements, specifically, scalable data architectures and robust communication networks that extend out to the grid edge. Without these basics in place, utilities will struggle to fully realize the potential of advanced analytics or automated systems. Utilities must avoid fragmentation and insist on interoperability, making sure data from diverse systems can be seamlessly integrated and leveraged across all operational domains.

    Additionally, workforce readiness is crucial. We can deploy cutting-edge technologies, but if we don’t prepare our people, we risk diminishing the returns on these investments. Proactively investing now in workforce upskilling and training programs will ensure that employees have the necessary skills to interact effectively with AI-driven systems and automation.

    There are working components of grid infrastructure that are over fifty years old, so we won’t be able to simply flip a switch and have this AI-powered grid. The work ahead is to identify the critical path that builds accretive value for ratepayers while managing a system that is growing in size and complexity.

    Craig Rizzo: Christian is hitting on a very important point here: this isn’t a greenfield on which we can build a whole new grid. We need to recognize that each utility is in a different starting position, and so the first “no regrets” moves, as we put it, may be different as well.

    Some may need to refresh and rationalize their tech stack. Some can reexamine workflows and processes for opportunities for automation. Ultimately, this will be a complex transformation that will require orchestration of many capabilities, from processes to people and organization, policies and governance, and, yes, data and technology.

    That said, we see time and again that data quality is a significant hurdle for our utility clients that are interested in deploying AI and ML technologies. Getting that right, establishing rigorous and sustainable data governance, is essential for any automation implementation. The pieces Christian mentioned, the architecture and communication infrastructure, are critically important to manage the scale and density of data that will be coming in. I’d emphasize that starting with robust data governance that delivers high-quality, secure data is going to be a key “no regrets” move that enables many (if not all) advanced Feeder of the Future capabilities downstream.

    This is something that can be done today. Starting with this approach will enable utilities to validate the value of their technology and build operational confidence, setting a strong foundation for more extensive automation down the road.

    The lesson? You can start small, but start with focused initiatives that validate capabilities and show value.
