
Incentives are key to breaking the cycle of cyberattacks on critical infrastructure

The path to protecting critical infrastructure from cyberattack may lie not through new technology, but through a better understanding and shaping of incentives

Interplay of incentives

A marshmallow. That may be the secret to improving the cybersecurity of critical infrastructure. 

Many may know of the famous marshmallow experiment conducted by Walter Mischel at Stanford University, in which children were offered a marshmallow but promised two if they could resist eating the first one for a given period. Less well known are the follow-on experiments that looked at how cooperation and social pressure changed children’s behavior. Researchers presented children with the marshmallow task but also told them that getting two marshmallows depended on another child also not eating their marshmallow. Counterintuitively, kids whose reward depended on others were more likely to wait than those going solo, indicating that working together was more effective than going it alone.1 The incentives toward collaboration and social connection worked against the incentive toward short-sighted self-interest.

The same themes resonate when discussing the cybersecurity of critical infrastructure. Officially, critical infrastructure can be any of 16 sectors ranging from the expected, such as nuclear and chemical, to the perhaps more unexpected, such as agriculture and rail car manufacture. But the proper functioning of these sectors doesn’t stop at just the companies involved—there are many critical functions that require the support of a wide range of stakeholders, from software companies to internet and web-hosting service providers to regulators.2 The success of security strategies such as defense in depth or layered defense depends on all of these stakeholders working toward a common goal. But importantly, each of these stakeholders has a different set of incentives pushing and pulling on their behavior.

Even adversaries are incentivized by different trends to increase or decrease their attacks. The challenge is that in such a complex environment as critical infrastructure, the incentives of one player may combine with the incentives of other players in unexpected ways, often leading to actions that look individually rational but have irrational effects at the industry level. 

Securing critical infrastructure from cyberattacks takes more than defending critical infrastructure assets; it requires an understanding of the incentives of all those stakeholders and then shaping them. If we can harness the positive incentives toward collaboration and social connection, then, just like the children in the experiment, we can enjoy the reward—perhaps not a marshmallow, but more resilient critical infrastructure that is available when citizens need it most. 

Threats to critical infrastructure are outpacing protections

Attacks targeting critical infrastructure are nothing new. From cutting off the water supply to a besieged city to the Allied strategic bombing campaign in World War II, adversaries have always sought to use critical infrastructure as leverage against opponents. However, the need to physically attack infrastructure typically limited these attacks to wartime. Today, trends in digital technology and international relations have come together to make the threat to critical infrastructure not only more common but also potentially more dangerous.

Threats to critical infrastructure are increasing

Tech trends driving increasing vulnerability. The increasing computing power and falling size and cost of processors, memory, and batteries mean that the physical and digital worlds are blending. Objects that had been purely physical, such as pumps and valves, may now have digital sensors or controls. Those digital devices at the edge (sensors, controllers, Internet of Things) are then often linked to the core IT networks (data storage, enterprise software) that may themselves be connected to the wider internet. This convergence of information and operational technology (IT and OT) can make every valve, switch, and pump in a critical infrastructure operation a computer potentially accessible to the internet, vastly increasing the challenge of securing them.

While these physical-digital devices help boost efficiency, they can also make security more difficult in two ways. First, they have led to a proliferation of devices that must be protected. With an estimated 46 billion connected devices in 2021, a number that doubles roughly every three years, it is not much of an exaggeration to say that the attack surface that must be defended is nearing infinity.3 While only a small percentage of those endpoints may belong to critical infrastructure, the trend of a growing attack surface affects the cybersecurity of critical infrastructure. Not only does it increase the technical challenge of trying to secure all of those endpoints, but it also increases the human and organizational problem of having to collaborate with even more manufacturers, vendors, and contractors to maintain the security of all those systems. This translates into a significant increase in the risk faced by critical infrastructure, given that about 85% of all data breaches involve a human element.4

Second, the convergence of the physical and digital worlds makes the consequences of attacks harder to predict and, potentially, more damaging. While information technology and operational technology have traditionally been secured differently, increased connectivity is driving their security considerations together. In a world where digital systems can control physical outcomes, digital attacks can have catastrophic consequences in the physical world as well. In the first recorded cyber-physical attack against critical infrastructure, a disgruntled former employee used radios to send faulty commands to industrial control systems at a wastewater plant, releasing 800,000 liters of sewage into a local community.5 Even more concerning, the interconnections of modern commerce and the difficulty of attributing cyberattacks blur the line between what is simply one company’s problem and what is a national security crisis. A criminal gang knocking a school district’s network offline may be a matter for law enforcement, but a nation-state cyberattack causing physical damage to a steel plant could be seen as a clear act of war.6

Economic and international trends encourage actors to act on those vulnerabilities. More than just technology is driving the increase in cyberattacks. Rising geopolitical tensions, difficulty in attribution, and the increasing balkanization of technology ecosystems encourage nation-states to see cyberattacks as an effective tool below the threshold of armed conflict.7 International tensions give nation-states the motivation to attack, while balkanized tech ecosystems allow them to attack with greater assurance of avoiding the consequences of either adversary responses or unintentional blowback on their own systems. These drivers have played a role in the significant increase in nation-state–sponsored attacks in recent years, an increase that some researchers have measured at up to 100% over the past three years.8

But nation-states are not the only threats. The critical nature of this infrastructure also makes it a lucrative target for cybercriminals, who see owners as more likely to pay ransom to avoid disruption.9 Not only has the potential payoff of an attack risen, but the means of attack are also becoming more available. The emergence of malware-as-a-service, along with the escrow and dispute resolution services that facilitate deals on the dark web, has effectively lowered the barrier to entry into cybercrime. Attackers no longer need to be skilled hackers; they just need access to criminal marketplaces and a few dollars to buy ready-made malware from the thriving businesses that sell it.

Defensive efforts to date have largely been ineffective

While technology and international trends may be driving an increase in cyberattacks against critical infrastructure, the threat itself is not new. The federal government has been working on the problem since 1996, when Executive Order 13010 defined “critical infrastructure” for the first time and established the President’s Commission on Critical Infrastructure Protection to protect it. Successive executive orders and policy directives further refined the structure and responsibilities for protecting critical infrastructure.

However, even with that early focus on both critical infrastructure and cyberthreats specifically, the number and severity of attacks have increased.10 The question then is “why?” Why haven’t we been able to protect the national critical infrastructure, despite the resources and talent at our disposal? National cyber director Chris Inglis sees this as a problem of how we all work together. “We don’t actually defend these systems as a collaborative endeavor such that they have to beat all of us to beat one of us … It’s not to say we don’t have some very talented people and we don’t have some really great technology, but we’re not really joined up to solve this problem in a way that’s required.”11

In critical infrastructure sectors, the idea of working together is not new, and the concept of “collective defense” is well-known in cyber circles. So, what is standing in the way of progress toward that vision of defending collaboratively? The very incentives that push and pull the different players involved.

A tangle of incentives may be the problem

If cybersecurity of critical infrastructure is a known and important problem and yet progress toward greater security has been slow, it implies that there are other pressures on people’s decision-making. In other words, there are incentives tugging many stakeholders—including owners of critical infrastructure—away from actions that support security.12

There are clear incentives for individual stakeholders to act in ways that may not support the long-term security of critical infrastructure. Take attackers for example. The sheer amount of money that can be made from ransomware attacks alone provides a strong incentive for criminals of every stripe. In fact, our research into ransomware has found a clear correlation between the size of ransom demand and the volume of attacks. The more money to be made, the more attacks.

Despite the fear of being the target of such attacks, critical infrastructure owners may see little incentive to improve security beyond the bare minimum. Profit motives and thin margins in many of these industries often mean there’s little money left for costly investments in cybersecurity. And when incidents do happen, incentives to protect brand or minimize liability can lead owners or operators of critical infrastructure to be reluctant to share information about vulnerabilities and incidents, further increasing the risk to other owners and operators. Nor are infrastructure owners the only group whose incentives can lead to less secure behavior. Incentives to be first to market and keep costs low can even lead manufacturers in some tech sectors, such as Internet of Things and embedded systems, to bring insecure products to market.13

Incentives driving individual stakeholders may make their choices difficult, but these incentives are known and can be managed. The real challenge is the swirl of incentives that emerges when all stakeholders interact. Incentives can add up in odd ways. An individual actor making a rational choice based on its own incentives can unwittingly impose higher costs on itself because of the incentives of other players. This is the generalized form of the tragedy of the commons: It was rational for each individual owner to graze their sheep on common land as much as possible, but the sum of those incentives was an outcome no one wanted, the destruction of the common lands.

The exact same phenomenon can occur in cybersecurity. The national cyber director, Chris Inglis, describes it as “proactive ambivalence.” The confusing nature of the cyber ecosystem can mean that even in the face of massive, disruptive cyberattacks, individual stakeholders can have little incentive to change. “We’re generally aware as a society that something is amiss,” says Inglis. “You can’t miss this. You can’t stand there and watch the news reports and believe that nothing is amiss. Where the proactive ambivalence comes in is, we all believe it’s somebody else’s problem.”14

While the traditional solution to such “tragedies of the commons” is government regulation, that can be difficult in an ecosystem with as many players as cybersecurity. Rather, government may be able to shape the incentives of stakeholders to indirectly encourage them to take appropriate actions. Just as Section 401(k) of the tax code encourages personal retirement savings, government can help jump-start new action on cybersecurity. But shaping incentives first requires a clear understanding of how the actions of all stakeholders influence one another. Using the analytical tool of causal loop diagrams (see the sidebar, “Using causal loop diagrams to tease apart complex problems”), we have created a simplified picture of those interactions. With that picture, we can begin to identify where incentives are adding up in unintended ways, and even where changes can begin to reshape those incentives to help improve cybersecurity.

Using causal loop diagrams to tease apart complex problems

The web of interactions that is the cyber ecosystem may mean that no single actor can accomplish much alone, but it also means that by mapping out the loops in those interactions, we can identify where stakeholders’ actions come together to either improve or degrade overall security. The causal loop diagram is an analytical tool designed to create that literal map of stakeholder interactions. Each box in the diagram is an action taken by a stakeholder. The boxes are then connected if that action makes another action more or less likely. 

Once the full map of interactions is drawn, we can trace the lines of influence to see where they create feedback loops that either incentivize further attacks (called reinforcing loops in the literature) or disincentivize them (called balancing loops). These reinforcing and balancing loops can help identify where the seemingly rational incentives of single stakeholders, when layered with the competing incentives of other stakeholders, can create undesirable results. 

But the causal loop diagram is not just a descriptive tool. Because it lays out actions and incentives, it can help guide interventions. Looking at a particular loop from the perspective of a government regulator, for example, it can become clear which actions they may want to incentivize/disincentivize to reduce the risk of cyberattacks.
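To make this concrete, below is a minimal sketch, not the tool used for this report’s analysis, of how a causal loop diagram can be encoded as a signed graph of actions and how any feedback loop can be classified as reinforcing or balancing by multiplying the polarities of its links. The node names and polarities are illustrative stand-ins, loosely based on the loops described later in figures 4 and 6.

```python
# Each link is (from_action, to_action): +1 if the first action makes the
# second more likely, -1 if it makes it less likely. Names and signs here are
# illustrative assumptions, not the report's actual model.
links = {
    ("successful_attacks", "cyber_insurance_purchases"): +1,
    ("cyber_insurance_purchases", "ransom_payments"): +1,
    ("ransom_payments", "successful_attacks"): +1,   # figure 4-style loop
    ("successful_attacks", "public_outcry"): +1,
    ("public_outcry", "cyber_investment"): +1,
    ("cyber_investment", "successful_attacks"): -1,  # figure 6-style loop
}

def classify_loop(loop):
    """Multiply link polarities around a closed loop of actions.

    A positive product means the loop amplifies its own behavior
    (reinforcing); a negative product means it counteracts it (balancing).
    """
    sign = 1
    for a, b in zip(loop, loop[1:] + loop[:1]):
        sign *= links[(a, b)]
    return "reinforcing" if sign > 0 else "balancing"

print(classify_loop(["successful_attacks", "cyber_insurance_purchases", "ransom_payments"]))
# -> reinforcing
print(classify_loop(["successful_attacks", "public_outcry", "cyber_investment"]))
# -> balancing
```

The same sign-counting rule is what lets an analyst scan a full map of stakeholder interactions and pick out which loops are quietly driving attacks upward and which are holding them in check.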


Reshape incentives to protect critical infrastructure

The complex mix of incentives across all stakeholders is a massive challenge, but it can also offer the path to a solution. If incentives stand in the way of the adoption of better security procedures or more effective information-sharing, then reshaping those incentives can be an effective way to make progress toward more security.

There are many ways to reshape incentives for individuals, organizations, and even adversaries. Economists, philosophers, and legal theorists have argued over them for centuries. One useful categorization holds that incentives can be shaped by enforcement, market, reputational, and moral pressures (figure 2).15 Our mapping of the tangled web of incentives across the various cyber stakeholders can help show not only where those pressures can be exerted, but also who has the ability to exert them.

Enforcement pressure. The most direct path to reducing cyberattacks is to target the incentives of the attackers themselves. Reducing attackers’ motivation to attack is difficult, but given the relatively finite set of attackers, it can often still be preferable to trying to secure the near-infinite attack surface of today’s critical infrastructure. Our map of incentives in the cyber domain shows that defense and intelligence organizations have two main levers to influence attacker motivations: They can disrupt the confidence of attackers by “defending forward” in the digital domain, or they can reduce the perceived legitimacy of attacks by using influence operations in the cognitive domain.

For example, following a series of attacks carried out by a state-sponsored hacking group, Dutch intelligence hacked back into the group. This “defend forward” approach gave the agency access to the hacker group’s systems and cameras, enabling it to gather confidential information and even warn international allies of impending attacks.16 Such actions can dent an adversary’s confidence to attack in the future. For more on how government agencies can scale these enforcement actions, see our article on Government’s role in deterring cyberattacks.

Figure 3 shows the anatomy of this loop of incentives: Cyberattacks encourage defense and intelligence organizations to increase information operations and “defend forward” operations. Defend forward operations decrease the confidence attackers have in their ability to successfully carry out attacks and so reduce the number of attacks. Information operations reduce the perceived legitimacy of cyberattacks, thereby reducing the attackers’ motivation to conduct more attacks.


Market pressure. Shaping the incentives of attackers can only go so far; systems themselves also need to meet at least a minimum bar of security. Part of the problem is that in today’s tragedy of the commons, infrastructure owners can be incentivized to push their own costs onto society. For cyberattacks, that means avoiding the cost of better cyber defenses and allowing society to absorb the costs of any attack that may occur—whether in the form of lost services or government response to an attack.

To remedy this, the full societal cost of potential attacks needs to be built back into infrastructure owners’ calculations. One way to reflect the true societal cost of cyberattacks is to penalize those who fail to meet basic security standards. For example, the Federal Trade Commission recently warned companies to patch the Log4j vulnerability or face legal actions, including penalties.17 Another way is to ensure that products such as cyber insurance reflect the true cost of attack and recovery. Rising cyber insurance costs that reflect the massive costs of responding to cyberattacks may help encourage infrastructure owners to invest more in cyber defenses.18 Further, some insurers also require organizations to adhere to baseline security practices to prevent the attack or reduce disruption in case of an attack.19 

Figure 4 shows the cyber insurance reinforcing loop of incentives. Successful attacks can increase the rate at which targeted industries buy cyber insurance. In some cases, that cyber insurance can be used to pay a ransom if attacked. The payment of ransom, in turn, encourages attackers to attack more.


Paying more is not the only form of economic incentive. There can also be positive economic pressures that encourage more secure behaviors—for example, the opportunity for companies to make money by filling a needed role in the cybersecurity ecosystem. 

Our map of incentives uncovered a few responses to cyberattacks that function like AND gates where an appropriate action can only be taken when two different stakeholders have the same information about an attack. For example, taking a malware marketplace offline requires not only law enforcement with the legal authority to seize websites and servers, but also denial of key services by web hosting and internet service provider (ISP) companies. 

The takedown of Emotet, one of the world’s largest botnets, is a prime example. Europol, the EU’s law enforcement cooperation agency, worked with the law enforcement agencies of eight countries and private security researchers to disrupt the Emotet malware.20 With infected computers spread across 90 countries, Europol needed not only to coordinate with legal authorities and law enforcement agencies in those eight countries, but also to draw on the technical expertise of technology companies. In the global takedown, law enforcement agencies and a large group of security industry players collaborated to hijack hundreds of Emotet’s command-and-control servers.21 In the United States, threat intelligence company Team Cymru was one of the companies that worked with the FBI in the operation. The company detailed and validated IP addresses of Emotet’s controllers and recruited network operators to help take down the servers.22

Without a common picture of the threat shared across law enforcement and commercial companies, this type of action would be impossible. For these types of operations to be successful, there needs to be an organization brokering the sharing of information between the different parties. In the Emotet example, Europol filled much of that role because of its expertise and relationships. But in other cases, the needed expertise and trusted relationships may lie outside of government.   

This situation can create a classic need for brokerage where trusted players can help facilitate the rapid movement of information between stakeholders. Just like brokerage in other industries, from oceanic shipping to choosing a restaurant, this economic opportunity can attract players to help improve the efficiency of the whole system. In the case of cybersecurity, the need goes beyond mere information-sharing and into connecting technical knowledge with threat data and knowledge of government authorities. These connections also need to happen at machine speed, which means that a brokerage solution could look more like a platform such as the Bloomberg Terminal, where users can subscribe and be connected as their needs align.

Figure 5 shows the anatomy of the “AND gate balancing” loop of incentives; it features two loops that must overlap to succeed. A successful attack against government may increase the sharing of threat information with law enforcement, leading to legal authorities taking down a malware marketplace (orange loop). But taking down that marketplace is only possible if the relevant technology companies are also aware of the details of the attack at the same time and can act to deny the services needed by the marketplace (purple loop). Only then will the marketplace be taken down completely, depriving attackers of the ability to conduct further attacks.


Reputational pressures. Reputation is another area where both positive and negative pressures can shape incentives. We are all familiar with negative reputational pressures: the bad press and damaged brand perception that can come from falling victim to a cyberattack. However, this bad press can serve a good purpose. If harnessed, it can be an important incentive encouraging critical infrastructure owners to invest more in cyber defenses.

Figure 6 shows the anatomy of the public-shaming balancing loop of incentives. An attack resulting in a public data compromise can lead to public outcry that motivates greater investment in cybersecurity, thereby making further attacks more difficult.


But there are also positive reputational pressures that can be even more effective. By telling positive stories of companies that did the right thing and the results it produced, a few positive outliers can serve as exemplars, pulling everyone’s behavior in positive directions. For example, imagine a technology service provider that is attacked, but rather than sweeping the incident under the rug, it divulges the information quickly to the right government authorities. Law enforcement is then able to take action while the trail is still hot and arrest the perpetrators. One good example of such a story is Microsoft’s recent action against the Necurs botnet. To eliminate the botnet, Microsoft obtained legal authority to take control of Necurs servers in the United States, worked with domain registrars in multiple countries to prevent Necurs from registering new domain names, and even worked with ISPs to help uninstall Necurs malware from infected computers. Similar good news stories of commercial-led cyber defense could be an important balance to the instinct to hide bad news.

Figure 7 shows the anatomy of the commercial-led balancing loop of incentives. Often, a technology provider may be the first to become aware of a cyberattack. That technology company can then not only take steps to deny critical services to attackers, but it can also share information with law enforcement to gain appropriate legal authorities to do so. This commercial-led activity can then remove marketplaces or other tools that attackers rely on, reducing their ability to conduct further attacks.


Moral pressures. Talking about moral pressure may seem out of place in a discussion on cybersecurity, but especially when dealing with large groups of people, common conceptions of what is right can be important pressures. For example, two of the largest and often-overlooked stakeholder groups in cybersecurity are users and the public. Both can create strong positive or negative pulls on cybersecurity. For example, users’ desire for greater functionality and ease of use can often run counter to cybersecurity tools that restrict features or access. Similarly, public desire for limited government spending can shrink resources for cybersecurity.23 But the public can also be a force for better cybersecurity. Public pressure following high-profile cyberattacks has been an important impetus to improving cyber defenses.24

Communicating the value of cybersecurity to these groups—in terms that they can understand and value—can help establish cybersecurity as one of the many “goods” that people balance in making decisions. This can help users be more accepting of limited functionality if it makes their data more secure, or the public more willing to support greater government investments in cybersecurity.

Figure 8 shows the anatomy of this loop of incentives. Users’ desire for maximum functionality and ease of use can, at times, exert pressure that reduces investment in cybersecurity. Similarly, the general public’s perfectly reasonable desire to see responsible use of public spending can combine with other budget incentives within government organizations to exert similar pressure to reduce cybersecurity investments.


Getting started

From the categories of pressures that can reshape incentives, we can see that some actions are more suited to certain stakeholders than others. While there is no single silver bullet for cybersecurity, there are a set of actions that every stakeholder can begin to take today to help reshape the cyber environment.

1. Scope the problem: Inventory and monitor critical infrastructure assets. Critical infrastructure industries and government agencies should work together to inventory and monitor critical assets. If we can’t see the critical assets, we can’t defend them. The Department of Energy (DOE) launched a 100-day action plan to increase real-time information-sharing, visibility, detection, and response capabilities of operational technology in the electricity sector. The CEO-led Electricity Subsector Coordinating Council of electricity companies liaised with the DOE and deployed a technology tool that could provide visibility into electric systems. The initiative, known as Neighborhood Keeper, improved the visibility and monitoring of US electrical systems from 5% to 70%, while keeping the data anonymous and protecting companies’ privacy. Information about threats and vulnerabilities is shared in real time with each participant and with the Electricity Information Sharing and Analysis Center (E-ISAC) for the collective defense of a critical infrastructure sector. Many companies in the water and gas sectors are also adopting a similar approach and technology to protect against cyberattacks.25

2. Make connections: Understand your organization’s connections in the cyber ecosystem and build personal relationships across them. The tangle of incentives in our maps shows the complexity of the cyber ecosystem. Every stakeholder should understand their role in the ecosystem—whom they can influence and who influences them. This can help government and technology companies alike find new opportunities to reduce attacks and improve critical infrastructure defenses. But that level of collaboration is only possible with relationships of personal trust. A critical infrastructure owner will only share the details of a cyberattack, details that may not only prove embarrassing but could also reveal trade secrets, if they trust not just the organization but the specific individual at the other end. Exercising incident response playbooks with multiple stakeholders can help build the needed trust between government, tech providers, and critical infrastructure owners. While some ISACs run rehearsals or offer response tools, making the exercises more regular and widespread is a key aspect of building the human trust needed to react quickly in the event of a crisis.26

3. Set minimum security standards: Use regulatory and financial tools to ensure basic cyber hygiene for all. All of the complicated relationship-building and information-sharing is for naught if trust is immediately lost via a data breach or if critical infrastructure is left unprotected. Every organization, whether critical infrastructure, government, technology company, or third party, should put in place minimum sets of security standards calibrated to the function of critical infrastructure and the impact of its loss. For government, this means considering the use of regulatory power to set minimum cybersecurity standards for all IT goods sold. This can be done via hard regulations, such as government-defined minimum safety standards for automobiles, or soft regulations, such as the Underwriters Laboratories seal of approval on compliant household goods.

But setting minimum standards is not solely a task for government. Everyone, from tech companies to infrastructure owners to banks, has a role to play: 

  • ISPs and cloud service providers could work together to create “comply to connect” schemes where devices are unable to connect to the internet unless they are up to date on OS updates and other key patches (see the sketch after this list).
  • Banks and venture capitalists can use their financial levers to encourage security to be baked into earlier stages of product development. 
  • Infrastructure owners should implement multifactor authentication (MFA), adopt zero-trust architectures, and require cyber hygiene training for all users. These minor changes can have a significant impact. In fact, research indicates that MFA can block 99.9% of automated attacks on systems.27
  • Government should create a national cyber hygiene campaign to educate all citizens about the basic operations of the technology they use every day and how to protect themselves from common threats.
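As a rough illustration of the “comply to connect” idea in the first bullet, the sketch below shows the kind of gate a provider might run before granting a device network access. The device fields, thresholds, and policy are hypothetical assumptions for illustration, not any ISP’s or cloud provider’s actual scheme.

```python
# Hypothetical "comply to connect" check: a device is only allowed onto the
# network if its patch state meets a minimum hygiene bar. All names and
# thresholds are illustrative assumptions.
from dataclasses import dataclass
from datetime import date

@dataclass
class Device:
    device_id: str
    os_patch_date: date        # last OS security update applied
    critical_cves_patched: bool

MAX_PATCH_AGE_DAYS = 30  # illustrative policy threshold

def may_connect(device: Device, today: date) -> bool:
    """Grant connectivity only if the device meets the minimum hygiene bar."""
    patch_age = (today - device.os_patch_date).days
    return device.critical_cves_patched and patch_age <= MAX_PATCH_AGE_DAYS

# Example: a controller that has not been patched in months is routed to a
# remediation network instead of the open internet.
sensor = Device("pump-controller-17", date(2022, 1, 3), critical_cves_patched=False)
print("allow" if may_connect(sensor, date(2022, 5, 2)) else "quarantine")
```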

4. Harness market forces to do more: Economic incentives can drive greater confidential information-sharing. To go beyond the minimums of cybersecurity requires more than just penalties; it takes opportunities. By tapping into market forces, government and critical infrastructure players can encourage a mindset where cybersecurity is not an afterthought, but a central piece of business. 

These market incentives could also help attract new players to fill the critically needed brokerage role between government and tech companies in cyber incident response. If the government commits to funding such a role, it could greatly improve information flow to defenders and increase the chances of attackers being identified and foiled.

However, many organizations have historically been reluctant to share information rapidly due to fears of public disclosure, liability from the breach, reputational damage, and class action lawsuits. This reticence can be overcome in two ways. First, governments can consider models such as the Federal Aviation Administration’s aviation safety reporting system, which is premised on nonpunitive, anonymous reporting to regulators and communities about aviation threats.28 Second, government can help companies “win” by sharing information. Currently, only negatives can arise from sharing details of a cyberattack, such as lawsuits and reputation damage. But if companies could gain positive coverage, it could help change the dynamic. If government could work with companies to help counter or even arrest attackers, it could give them a reputation boost in the market, which in turn could help encourage further information-sharing.

Closer working relationships, such as the Cybersecurity and Infrastructure Security Agency’s (CISA) new Joint Cyber Defense Collaborative, can help make this a reality, but clearer ideas about whom to report information to, and how, are still needed. For government, this means having a single door that critical infrastructure industries and technology partners can use. The lead agency can then fuse the information it receives with other useful information and disseminate it to those who need it in industry, government, and beyond. This level of sharing will likely require creative approaches to tiered levels of reporting for sensitive information (via automated tear lines), rapid analysis to support standardized threat reporting, and automated distribution along industry verticals.
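The tear-line idea lends itself to automation. Below is a minimal sketch, assuming a hypothetical report structure and tier labels of our own invention, of how a report’s fields could be tagged with a minimum releasable audience and filtered automatically for each tier, so the same submission can feed industry verticals and government recipients without manual redaction.

```python
# A minimal sketch, not an existing CISA system, of automated "tear lines" for
# tiered threat reporting. Field names, tier labels, and the sample report are
# hypothetical illustrations.
from dataclasses import dataclass

# Lower numbers are more widely releasable.
TIERS = {"public": 0, "industry_vertical": 1, "trusted_partners": 2, "government_only": 3}

@dataclass
class ReportField:
    name: str
    value: str
    tier: str  # minimum audience tier allowed to receive this field

SAMPLE_REPORT = [
    ReportField("indicator_hashes", "e3b0c442...", "public"),
    ReportField("affected_sector", "water utilities", "industry_vertical"),
    ReportField("attacker_infrastructure", "command-and-control details", "trusted_partners"),
    ReportField("victim_identity", "<withheld>", "government_only"),
]

def tear_line(report, audience_tier):
    """Return only the fields releasable to the given audience tier."""
    ceiling = TIERS[audience_tier]
    return [f for f in report if TIERS[f.tier] <= ceiling]

# A sector ISAC subscribed to its industry vertical would automatically receive
# the indicators and sector context, but not the victim's identity.
for field in tear_line(SAMPLE_REPORT, "industry_vertical"):
    print(f"{field.name}: {field.value}")
```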

This is just the beginning …

In recent years, cyberattacks on critical infrastructure have had a far-reaching impact on Americans. But with no stakeholder able to tackle the problem alone, progress is only possible if we create incentives for stakeholders to work together. Reshaping the incentives of an entire industry may be difficult, but it is possible. Even children were able to collaborate with only a marshmallow as incentive. We have the safety of our critical infrastructure as an incentive. What are we waiting for?

Deloitte Cyber

As a recognized leader in cybersecurity consulting, Deloitte Cyber includes thousands of dedicated cyber professionals, across numerous industry sectors, who help clients better align cyber risk strategy and investments with strategic business priorities, improve threat awareness and visibility, and strengthen their ability to thrive in the face of cyber incidents. In the realm of Cyber Everywhere, the ubiquity of cyber drives the scope of our services. Deloitte Cyber advises, implements, and manages solutions in strategy, defense, and response; data security; application security; infrastructure security; and identity management. 


1. Jill Suttie, “Kids do better on the marshmallow test when they cooperate,” Greater Good Magazine, February 24, 2020.
2. Cybersecurity & Infrastructure Security Agency, “National critical functions,” accessed February 2022.
3. Juniper Research, “‘Internet of Things’ connected devices to triple by 2021, reaching over 46 billion units,” December 13, 2016.
4. Verizon, 2021 Data Breach Investigations Report, accessed February 2022.
5. Marshall D. Abrams and Joe Weiss, “Malicious control system cyber security attack case study: Maroochy Water Services, Australia,” MITRE, August 2008.
6. Department of Homeland Security, Commodification of cyber capabilities: A grand cyber arms bazaar, accessed February 2022.
7. For a description of how increasing tech balkanization encourages nation-state cyberattacks, see Jesse Goldhammer et al., Leading the way with an adversary focus: Government’s role in deterring cyber attacks, Deloitte Insights, August 4, 2021. For more on how geopolitical tensions can drive cyberattacks, see CISA Insights, “Increased geopolitical tensions and threats,” January 6, 2020.
8. HP, Nation states, cyberconflict and the web of profit, April 8, 2021.
9. Rishi Iyengar and Clare Duffy, “Hackers have a devastating new target,” CNN, June 4, 2021.
10. Stephanie Jones, “Protecting the United States’ critical infrastructure from cyberattacks,” Texas A&M Today, November 2, 2021.
11. CBS News, “National cyber director Chris Inglis on deterring cyber threats—‘Intelligence Matters’ podcast,” November 24, 2021.
12. Ross Anderson, “Why information security is hard—an economic perspective,” University of Cambridge, accessed February 2022.
13. Bruce Schneier, “Security economics of the Internet of Things,” Schneier on Security, October 10, 2016.
14. CBS News, “National cyber director Chris Inglis on deterring cyber threats.”
15. Our categorization of incentives comes from Steven Levitt and Stephen Dubner’s book Freakonomics, where they categorize “three basic flavors of incentive: economic, social, and moral.” Our categorization of the levers that can shape those incentives is a combination of Lawrence Lessig’s norms, markets, laws, and architecture and Bruce Schneier’s moral, reputational, institutional, and security pressures.
16. Catalin Cimpanu, “Netherlands can use intelligence or armed forces to respond to ransomware attacks,” The Record, October 7, 2021; Sean Gallagher, “Candid camera: Dutch hacked Russians hacking DNC, including security cameras,” Ars Technica, January 26, 2018.
17. Tonya Riley, “FTC warns of potential penalties for firms that fail to fix Log4j software flaws,” CyberScoop, January 4, 2022.
18. US Government Accountability Office, Insurers and policyholders face challenges in an evolving market, May 2021.
19. Institute for Security and Technology, Combating ransomware, accessed February 2022.
20. Andy Greenberg, “Cops disrupt Emotet, the internet’s ‘most dangerous malware’,” Wired, January 27, 2021.
21. Ibid.
22. Lance Whitney, “Emotet malware taken down by global law enforcement effort,” TechRepublic, January 27, 2021.
23. The inherent tension between wanting more/better services and where the funding for those services should come from can be seen in research by the Pew Research Center.
24. David E. Sanger and Nicole Perlroth, “Pipeline attack yields urgent lessons about U.S. cybersecurity,” New York Times, June 8, 2021.
25. Energy Commerce, “Countering ransomware in critical infrastructure,” July 20, 2021.
26. Examples of such exercises and tools include those from the E-ISAC (Maggie Miller, “Hundreds participate in electric grid cyberattack simulation amid increasing threats,” The Hill, November 18, 2021) and FS-ISAC (FS-ISAC, “Exercises: Build stronger plans and a more resilient business,” accessed February 2022).
27. Catalin Cimpanu, “Microsoft: Using multi-factor authentication blocks 99.9% of account hacks,” ZDNet, August 27, 2019.
28. US House Committee on Oversight and Reform and House Committee on Homeland Security, “Prepared statement of Kevin Mandia, CEO of FireEye, Inc.,” February 26, 2021.

The authors would like to thank Thirumalai Kannan D and Matt Stapleton for contributing the causal loop diagrams. The authors would also like to thank Andrea Rigoni, Sean Mordhorst, Jesse Goldhammer, Carey Miller, Anthony Fratta, Joseph Price, Ian Fleming, William D. Eggers, Bruce Chew, and John O’Leary for contributing their time and insights to the report.

Cover image by: Sonya Vasilieff
