
Imagine it’s May 2028. Your organization has just taken ownership of a brand-new data center, and you’re thrilled at the volume of workloads it can handle while keeping cloud costs under control. You’ve finally figured out and implemented a new hybrid infrastructure strategy. Then your CEO forwards you an article about quantum computing and asks how it fits into the picture. How will you respond? Will you need to reengineer your systems?

The scenario may still be hypothetical, but it could be all too possible in the coming years. “Based on vendor road maps, we expect to see the first commercially relevant quantum computing use cases in the next 24 to 36 months,” says Scott Buchholz, the US-based quantum computing lead for Deloitte Consulting. “As companies make infrastructure plans for growing workloads, they should be thinking about quantum computing and how it might need to be integrated into evolving high-performance computing strategies.”

The eventuality of QPUs

For many years, enterprises “got most of their horsepower out of mainframes and traditional CPUs,” says Diana Kearns-Manolatos, technology transformation research leader in the Deloitte Center for Integrated Research. Today that’s changing. “Organizations are making different choices based on their overall priorities, shifting to increasingly hybrid strategies with data hosted and algorithms running across public and private cloud, edge solutions, and private data centers,” she says. “Data sovereignty, cost, speed, latency, and control all factor into these hybrid computing strategies.”

While many road maps focus on achieving mass-scale capabilities, organizations with too narrow a view may miss future interoperability and hybrid computing connection points. Sooner than many might expect, QPUs—or quantum processing units—could be part of the mix for some enterprises, says Mekena McGrew, the US-based quantum information lead for Deloitte Consulting: “The preparation needed for enterprises to integrate quantum algorithms and the QPUs to process them into their infrastructure road map is considerable.”

As lasers are to light bulbs

There are fundamental differences between quantum and classical computers that will require reframing business problems with quantum information science principles in mind, Kearns-Manolatos says.

Indeed, quantum computers are as similar to classical computers “as lasers are to light bulbs,” Buchholz says. “They might both emit light, but they are not substitutes for one another.” As a result, “there are some problems that may be tractable to quantum computers that are not tractable to today’s supercomputers.”

“There will likely be a whole host of architectural challenges for which many enterprises are not yet prepared.”
—Mekena McGrew, quantum information lead, Deloitte Consulting LLP

In some ways, quantum computers are like “a flavor of supercomputer,” McGrew says. “But quantum computers solve problems very differently than today’s supercomputers do, using different math and different techniques.” To wit: Breaking existing encryption techniques—a task that would likely take hundreds of thousands of years on today’s supercomputers—may take weeks or even just days on future quantum computers.

Increasingly, state-of-the-art research is being run jointly on supercomputers and quantum computers working together, Buchholz says. “In the future, you might run CPUs, GPUs, and QPUs along with all of your other infrastructure to try to make the most effective and efficient system possible for solving really complex problems.”

It can be more predictive

There are numerous examples of problems that could benefit from a hybrid quantum-classical approach, including precision molecular design and financial services applications such as options pricing, risk modeling, or stress testing, McGrew says. Consider a team that aims to discover a chemical with particular properties. It could start by using machine learning to refine the scope, then send the problem to a quantum computer, which performs more exact calculations. Those calculations could then be fed back into the machine learning model to improve the model’s own predictions through reinforcement learning. The team can then continue to run the two processes in parallel until it gets down to a small enough set of target chemicals. From there, it can shift to the lab to confirm the results.
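The loop described above—a cheap classical screen narrowing the field, a more exact evaluation rescoring the shortlist, and the results feeding the next round—can be sketched in purely classical code. In this illustrative sketch, `ml_score` stands in for a trained machine learning model and `exact_score` stands in for a quantum (or other high-precision) computation; both are hypothetical placeholders, not any particular vendor's API.

```python
def hybrid_search(candidates, ml_score, exact_score, rounds=2, keep=3):
    """Iteratively narrow a candidate pool using a cheap screen plus a
    more exact rescoring step, as in the hybrid workflow described above."""
    pool = list(candidates)
    for _ in range(rounds):
        # 1. Cheap heuristic screen (stand-in for a machine learning model).
        pool.sort(key=ml_score, reverse=True)
        shortlist = pool[: keep * 2]
        # 2. Expensive, more exact evaluation (stand-in for a QPU job).
        scored = sorted(((exact_score(c), c) for c in shortlist), reverse=True)
        # 3. Feed the precise scores back: keep the best for the next round.
        pool = [c for _, c in scored[:keep]]
    return pool

# Toy usage: find the integers closest to a target of 7.3. The "ML" screen
# only knows a rough (rounded) target; the "exact" step knows the true one.
best = hybrid_search(range(15),
                     ml_score=lambda x: -abs(x - 7),
                     exact_score=lambda x: -abs(x - 7.3))
```

The point of the structure, as in the chemistry example, is that the expensive evaluator only ever sees a shortlist, while its precise answers keep improving what the cheap screen passes along.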

“When it works, this combination can be better in terms of predictive power and capability than one or the other alone,” Buchholz says. “It can be more predictive than machine learning, which is heuristic, and takes advantage of the fact that quantum computers can do some of these calculations more quickly or accurately.”

Getting quantum-inspired

Given the long time horizons and capital build-outs that may be needed to realize quantum-minded infrastructure expansion, leaders should think ahead now, as they make those plans, rather than react when quantum’s time comes, Buchholz says.

Adds McGrew, “There will likely be a whole host of architectural challenges for which many enterprises are not yet prepared.”

The first step is to determine when quantum computing is likely to have value for the organization, Buchholz says. “There should be a clear business need that a quantum computer is likely to help advance.”

Another consideration is the importance of having a team with the necessary skills. It’s not a matter of simply sending people to training for a week: “Quantum technologies work so differently that it can take a couple of years to learn how to use them efficiently and effectively,” Buchholz says.

One approach that can help build experience without the need to buy QPUs is quantum-inspired computing, McGrew says. “Companies can solve some complex problems using just the cloud or GPUs in their data center to help prepare them for advanced workloads that may include QPUs in the future,” she says. “It’s a way to think about the future of high-performance computing architecture and start building in a way that could last throughout the next generation of technology.”
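One commonly cited member of this quantum-inspired family is simulated annealing, a classical heuristic that mimics how annealing-style quantum optimizers explore a solution space, and which runs on ordinary CPUs or GPUs today. The sketch below is a generic illustration, not a production solver; the cost function and neighbor rule are toy assumptions.

```python
import math
import random

def anneal(cost, start, neighbor, steps=5000, t0=1.0, seed=0):
    """Minimize `cost` with simulated annealing, a classical heuristic often
    grouped under 'quantum-inspired' optimization (illustrative sketch)."""
    rng = random.Random(seed)
    x, best = start, start
    for step in range(1, steps + 1):
        t = t0 / step                       # simple cooling schedule
        cand = neighbor(x, rng)
        delta = cost(cand) - cost(x)
        # Always accept improvements; accept uphill moves with a
        # temperature-dependent probability that shrinks as t cools.
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x = cand
        if cost(x) < cost(best):
            best = x
    return best

# Toy usage: a bumpy one-dimensional cost with its minimum at x = 12.
f = lambda x: (x - 12) ** 2 + 5 * (x % 3)
step = lambda x, rng: x + rng.choice([-2, -1, 1, 2])
result = anneal(f, start=0, neighbor=step)
```

Running quantum-inspired heuristics like this on existing cloud or data-center hardware is one way teams can prototype the problem formulations they would later hand to a QPU.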

The time has come to get strategic

For some business leaders, understanding quantum’s implications might be the 11th item on their “top 10” list of things to do, Buchholz says. But at some point, organizations that run high-performance computing will likely want to offload at least parts of their business problem-solving onto quantum computers. “People who run existing high-performance computing clusters and supercomputers should be thinking about not just how they’re going to do the integration, but also what technologies are available given their stacks as they think about building their next generation,” he says.

Quantum computing will likely arrive “sooner than most people think,” and its integration with high-performance computing is “almost a given,” Buchholz adds. “It’s important to make sure somebody in the organization has spent some quality time thinking about what it means for them,” he says. “The time has come to get strategic and develop a road map.”

By Katherine Noyes, United States

ACKNOWLEDGMENTS

Cover image by: Shutterstock

Knowledge services: Agni Wagh
