Jeroen Kusters

United States

Deb Bhattacharjee

United States

Jordan Bish

Netherlands

The semiconductor industry is navigating a high-stakes paradox in 2026. While soaring artificial intelligence–driven demand is pushing revenues to unprecedented levels, the boom carries risks: The industry has placed nearly all its eggs in the AI basket, which may be fine if the AI boom continues. But it should also consider planning for scenarios in which AI demand slows or shrinks.


The state of the market today

The global semiconductor industry is expected to reach US$975 billion in annual sales in 2026, a historic peak fueled by an intensifying AI infrastructure boom (figure 1).1 Growth reached 22% in 2025 and is projected to accelerate to 26% in 2026, and even if growth moderates thereafter, annual sales of US$2 trillion seem likely by 2036. However, this record growth masks a stark structural divergence. While high-value AI chips now drive roughly half of total revenue, they represent less than 0.2% of total unit volume.2 Another divergence is that, as AI chips are booming, chips for automotive, computers, smartphones, and non–data center communications applications are seeing relatively slower growth.3

The stock market is often a leading indicator of industry performance. As of mid-December 2025, the combined market capitalization of the top 10 global chip companies was US$9.5 trillion, up 46% from US$6.5 trillion in mid-December 2024 and 181% from US$3.4 trillion in mid-December 2023.4 Further, the market cap is highly concentrated, with the top three chip stocks accounting for 80% of that total.

At the time of publication, Deloitte predicts that generative AI chips will approach US$500 billion in revenue in 2026, or roughly half of global chip sales.5 Further, AMD CEO Lisa Su has raised her estimate for the total addressable market of AI accelerator chips for data centers to US$1 trillion by 2030.6

In 2025, an estimated 1.05 trillion chips were sold at an average selling price of US$0.74 per chip.7 At a rough estimate, although gen AI chips are likely to account for about 50% of industry revenues in 2026, they represent fewer than 20 million units, or less than 0.2% of total volume.8 Even though global chip revenues in 2025 are expected to rise 22%, silicon-wafer shipments increased by only an estimated 5.4% for the year.9
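The value-versus-volume divergence can be sanity-checked against the report's own figures. A minimal sketch, using the revenue and unit estimates cited in the endnotes (the 2025 unit count stands in for 2026 volume, which the report does not state separately):

```python
# Back-of-envelope check of the headline figures cited in the text.
# Assumed inputs, all drawn from the outlook's own estimates:
#   2025 revenue: ~US$772B across ~1.05 trillion units (endnote 7)
#   2026 gen AI chip revenue: ~US$500B of ~US$975B total (endnotes 1, 5)
#   2026 gen AI chip volume: ~20 million units (endnote 8)

revenue_2025 = 772e9   # US$, total industry revenue, 2025
units_2025 = 1.05e12   # chips sold, 2025

asp = revenue_2025 / units_2025
print(f"Average selling price: ${asp:.2f} per chip")  # ~$0.74

ai_revenue_2026 = 500e9
total_revenue_2026 = 975e9
revenue_share = 100 * ai_revenue_2026 / total_revenue_2026
print(f"Gen AI revenue share: {revenue_share:.0f}%")  # ~51%

ai_units_2026 = 20e6
volume_share = 100 * ai_units_2026 / units_2025  # 2025 volume as a proxy
print(f"Gen AI volume share: {volume_share:.4f}%")  # well under the 0.2% ceiling cited
```

The arithmetic confirms the text's central point: roughly half the industry's dollars flow through a vanishingly small share of its unit volume.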

In terms of key end markets, personal computing device and smartphone sales, which were anticipated to grow in 2025,10 are now expected to decline in 2026 due to rising memory prices.11

About Deloitte’s TMT center outlooks

Deloitte’s 2026 global semiconductor industry outlook seeks to identify the strategic issues and opportunities for semiconductor companies and other parts of the semiconductor supply chain to consider in the coming year, including their impacts, key actions to consider, and critical questions to ask. The goal is to help equip companies across the semiconductor ecosystem with information and foresight to better position themselves for a robust and resilient future.

Revenues for memory in 2026 are likely to be about US$200 billion, or 25% of total semiconductor revenues for the year.12 Memory is notoriously cyclical, and makers appear cautious about overbuilding. As a result, they are increasing capital expenditures only modestly, with much of that going to research and development for new products rather than massively ramping capacity.13 With supply constrained, surging demand for HBM3 (High Bandwidth Memory 3), HBM4, and GDDR7 memory for AI training and inference solutions has caused shortages of consumer memory, such as DDR4 and DDR5; prices for these products rose about 4x between September and November 2025.14 Predicting memory supply, demand, and pricing is hard, but some suggest that the current tightness in consumer memory could last a decade.15 Further price increases are likely in the first and second quarters of 2026, perhaps as much as another 50%, with one popular memory configuration, for example, projected to reach US$700 by March 2026, up from US$250 in October 2025.16
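To put that trajectory in perspective, the implied compound monthly price increase can be derived from the two data points in the text (about US$250 in October 2025 rising to about US$700 by March 2026, roughly five months). A rough sketch, assuming smooth monthly compounding:

```python
# Implied compound monthly price growth for the consumer memory
# configuration cited in the text: ~US$250 (Oct 2025) to ~US$700 (Mar 2026).
start_price = 250.0  # US$, October 2025
end_price = 700.0    # US$, projected March 2026
months = 5           # October through March

monthly_rate = (end_price / start_price) ** (1 / months) - 1
print(f"Implied growth: {monthly_rate:.1%} per month")  # ~22.9% per month

# Cross-check the "another 50%" claim for early 2026: a 50% rise on top of
# a price that has already quadrupled implies a 6x move from the baseline.
total_multiple = 4 * 1.5
print(f"Total multiple vs. pre-shortage baseline: {total_multiple:.0f}x")
```

A sustained ~23% monthly increase underscores why forecasters describe the consumer memory market as unusually tight.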

This concentration of value appears to have contributed to a shift in market dynamics. As manufacturers prioritize the specialized hardware required for AI training and inference, the resulting “zero-sum” competition for wafer and packaging capacity is already disrupting downstream sectors. For leadership, the 2026 mandate moves beyond simply capturing AI demand to managing the systemic risks of a high-margin, low-volume paradigm, where severe shortages in essential components such as memory are projected to drive 50% price spikes by mid-year and redraw the global supply chain map.

Capitalizing on the AI data center boom in 2026 … with caution

The chip market is heavily exposed to AI chips for data centers, with up to roughly half of industry revenues expected to come from that market in 2026.17 But what might prevent that from happening? And what could that mean for the semiconductor industry, especially if non–data center markets such as the PC, smartphone, and automotive sectors remain weak?

First, those expectations are unlikely to change in 2026. The chips have already been ordered and are in backlog, data centers are under construction, and the numbers for the next 12 months are likely solid. But 2027 and 2028 could diverge sharply from current expectations for reasons noted below:

  • Return on investment: Most organizations building data centers don’t expect to recoup all their money in the first year. But over five- to 15-year periods, there should be reliable revenue flows whose present value generates some level of return for investors. If monetization of AI looks like it may take longer or be lower than expected, data center projects could be canceled or postponed, with an adverse impact on chip sales.
  • Power: AI data centers are expected to need 92 gigawatts of additional electric power by 2027.18 That power may not be available from the grid, and while some “behind-the-meter” gas generation was feasible in 2025, turbines are sold out going forward, making future gas generation increasingly challenging.19 Securing data center permits may also become difficult due to the risk of rising electricity rates for consumers.
  • Innovation: Every generation of chips becomes substantially more efficient, which can make the incumbent installed base more of a liability than an asset. AI models used for training and inference also appear to become more efficient over time, requiring less computation (or “compute,” to industry insiders) for the same tasks.20 These trends may already be baked into data center capex plans, but an orders-of-magnitude breakthrough in either chips or models could mean the need for fewer or cheaper chips.
  • Pricing: AI chips are currently expensive and command high margins.21 If new competitive chips are introduced at lower prices, the effect on the overall chip market could be deflationary, compressing prices and margins.

What could some or all of the above mean for the chip industry over the next one to three years?

Money and market impact: Chip designers and manufacturers that currently benefit from AI tailwinds could face headwinds. Revenue growth could decrease or turn negative. Earnings could be lower. Price-to-earnings and price-to-sales multiples could fall, and market caps could decline.

Fabs, tools, design tools, and more: Since AI chips are high in value but low in volume, a decline in revenues would likely have relatively little impact on companies that manufacture chips or the tools used to make them. Even if AI-chip volumes fall, because AI chips make up a small part of manufacturing capacity, it likely wouldn’t mean that fabs go idle. That said, companies producing certain types of packaging, memory, power, and communications semiconductors could be affected.

Strategic questions to consider

  • If AI chip demand slows in 2026 or beyond, how can chip companies effectively adapt while maintaining high cash levels and low debt alongside their capital spending commitments?
  • Computing chips, memory solutions, and packaging products used in AI data centers are fairly special-purpose in nature. Should data center demand experience a fall or correction, what other end-market opportunities are available for AI chipmakers to pivot toward?
  • Where and how can advanced memory and advanced logic manufacturing capacity be redirected if AI chip demand starts to correct in 2026?

The race for system-level performance: Compute, memory, and network connectivity

With AI data center workloads forecast to triple or quadruple annually between 2026 and 2030,22 chip- and system-level integration will be required to enable system performance in hyperscale data centers. As Deloitte has predicted, chiplets are addressing chip-level performance needs in AI data centers, delivering yield, bandwidth, and energy-efficiency benefits.23 Chip manufacturers in 2026 are likely to increasingly integrate HBM closer to logic chiplets, either on silicon interposers or in 3D stacks, allowing data to move much faster between processors—graphics processing units (GPUs) and neural processing units (NPUs)—and memory (HBM stacks), at multiple terabytes per second, while being more energy efficient (lower joules per bit and lower watts per token).24 Additionally, co-packaged optics (CPO) will likely gain traction in data center switches, enabling higher aggregate bandwidth per rack with a lower Ethernet/InfiniBand switch footprint.25 High-bandwidth flash, which can support faster scale-up (within a server rack) and scale-out (across multiple racks and systems), will likely experience more demand in 2026, especially as AI workloads shift from training to inference.26

However, because traditional copper Ethernet network designs cannot keep pace with AI workloads, which generate massive east-west traffic between GPUs, optical interconnects (both CPO and linear pluggable optics, or LPO) are likely to see greater adoption in 2026.27 AI network fabric spending is expected to grow at a compound annual growth rate of 38% between 2024 and 2029 (figure 2).28 As AI data center networks scale to switching capacities of 51.2 terabits per second and above—within and across racks and clusters—it’s critical not only to integrate the various components (memory stacks, compute systems, and rack-scale networks) but also to reassess the use of copper or traditional pluggables, which can drive up power consumption, limit bandwidth, or take up too much space. CPO and LPO can address those gaps in 2026: They shorten electrical paths, reduce power consumption by 30% to 50%, and offer higher bandwidth and better total cost of ownership.29
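A 38% CAGR compounds quickly: over the 2024-to-2029 window it implies AI network fabric spending roughly quintuples. A quick sketch (the absolute 2024 base is not stated in the report, so the result is expressed as a multiplier):

```python
# What a 38% CAGR over 2024-2029 implies for AI network fabric spending.
# The dollar base is not given in the text, so we compute only the multiple.
cagr = 0.38
years = 2029 - 2024  # five compounding periods

multiplier = (1 + cagr) ** years
print(f"2029 spend is about {multiplier:.1f}x the 2024 level")  # ~5.0x

# Year-by-year trajectory relative to the 2024 base
for year in range(2024, 2030):
    print(year, f"{(1 + cagr) ** (year - 2024):.2f}x")
```

Put differently, spending more than doubles in the first two years alone, which helps explain the urgency around optical interconnect capacity.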

Some hyperscalers are using advanced network chips from merchant silicon vendors and disaggregated hardware models in order to develop their own custom topologies on top of those solutions.30 However, in 2026, the industry could increasingly pivot to software-defined network fabrics that integrate compute and networking into a single, vertically integrated solution, given the benefits of superior performance, better orchestration, and lower total cost of ownership.

Even as cloud hyperscalers, AI network companies, foundries, and outsourced semiconductor assembly and test (OSAT) facilities race to address complex heterogeneous system integration challenges, they need to contend with the difficulties involved in next-generation back-end assembly and test processes. For instance, every chip product needs to pass through specific process steps such as modeling, simulation, thermal management, and bumping. These steps require specialized packaging expertise and statistical process control skills that are scarce in the United States and Europe.31 As a result, talent constraints in advanced packaging may continue to hinder regional goals of achieving greater semiconductor autonomy, even as volume-based back-end capacity expands further in Asia.32

Strategic questions to consider

  • Could sourcing and procurement be disrupted by materials constraints (supply and availability of substrates, memory, and interconnects); geopolitics (assembly and test capacities and suppliers in vulnerable regions); and the talent pipeline for test and packaging engineers?
  • As foundries and integrated device manufacturers (IDMs) deploy advanced technologies like chip-on-wafer-on-substrate and hybrid bonding to bring HBM closer to compute, could traditional OSAT models get commoditized?
  • To what extent should there be investments in next-generation interconnects such as CPO, LPO, photonics, and chiplet-based networking, and where and how can AI be applied to accelerate design cycles for these complex heterogeneous systems?

The rise of creative AI investments and deals that enable vertical integration

Strategic alliances among AI, semiconductor, and cloud infrastructure providers have heralded a new AI computing capital cycle. Investments made in 2025 will likely continue or accelerate in 2026, creating a funding and demand ecosystem in which capital and computing resources flow back and forth among companies involved in AI model development, AI accelerator design, production, packaging, and data center infrastructure.33 For instance, an investing company (typically a chip hardware, platform, or cloud infrastructure provider) may invest billions of dollars in an AI startup to accelerate the development of solutions. In return, the AI startup rapidly incubates new products and, in turn, buys the investing company’s computing resources and infrastructure offerings. These moves have become a way for chip companies to achieve vertical integration across the AI data center stack.

Besides AI training and inference workloads,34 another factor driving the surge in the semiconductor industry’s investment activity is the geopolitical imperative, as governments and businesses seek to influence regional technology infrastructure.35 Many governments consider AI models, chip design intellectual property, and leading AI accelerators to be critical to national security, supply chain resilience, and tech sovereignty.36 Increasingly, governments are using export control measures to bolster local and regional availability of leading-edge AI chip manufacturing,37 so that homegrown chipmakers can expand their market presence. Concurrently, they’re seeking a balance between restricting the export of strategic AI and technology products and allowing some advanced chips to be exported. For example, the US government in December 2025 approved NVIDIA to sell H200 AI chips to a set of approved customers in China, in return for a 25% share in NVIDIA’s chip sales.38 In the midst of these developments, Europe appears caught between US export controls (restricting advanced chip sales to China) and China’s countermeasures.

As tech and chip majors continue to pursue this new form of vertical integration (referred to by some industry analysts as circular financing), the semiconductor industry’s capital allocation strategies may need to shift from capacity- to capability-driven models, with an emphasis on achieving AI system-level differentiation. In 2026 and beyond, chip companies should consider not only expanding the breadth and scope of their operations by establishing more AI fabs or developing new AI chip platforms, but also fostering strategic partnerships and making direct investments to build an ecosystem around their fab or chip platforms.

Traditional volume-based foundries may want to integrate advanced packaging capabilities. OSATs could codesign chiplets with integrated device manufacturers and design players, while electronic design automation companies and foundries could benefit from collaborating closely with wafer-fab front-end equipment providers. As chip industry executives look for ways to deploy their cash strategically, they should consider assessing talent needs and skill availability, core competencies, and partner models that are more region- or country-specific. This assessment should also include non-AI market opportunities by focusing on mature chip nodes to address automotive and electric vehicle, aerospace and defense, manufacturing, and power infrastructure markets—many of which could be specific to the countries in which they operate.

Strategic questions to consider

  • With billions of dollars already flowing into AI computing and data center infrastructure capacity expansion, how can capital be deployed not only to build more capacity but also toward scaling power generation (including carbon-free sources) to support that expansion?
  • When deploying capital, have organizations evaluated various factors, including geopolitical trade-related risks and policy shifts such as import duties, export controls, and localization moves; supply chain concentration risks such as substrates, chemicals and gases, and other materials and components; supplier and partner models such as multi-foundry or multi-cloud vendor partnerships; and talent availability?
  • How should investments be balanced between leading-edge logic and memory manufacturing and packaging, and the continued need for trailing-node fabrication, equipment, and assembly and test?

Signposts for the future

For 2026, semiconductor industry executives should be mindful of the following signposts:

  1. Current leaders in AI GPUs, CPUs, and memory may find it challenging to maintain their dominant market share in the face of new entrants and the shift from AI training to inference. One view is that a growing pie is big enough for everyone, while others will be looking for signs that it’s more of a zero-sum game.
  2. DRAM capex is expected to rise 14% to US$61 billion and NAND flash capex 5% to US$21 billion.39 Given the end-of-year surge in prices, those numbers could spike higher, meeting near-term demand but possibly building overcapacity in the industry yet again.
  3. Increasing volume and value of deals that involve complex revenue-sharing agreements or compute-for-equity swaps could exert pressure on future profitability and ROI for AI model developers and data center infrastructure players, as tech debt mounts further.
  4. As North America, Europe, the Middle East, and Japan intend to ramp up their own domestic chip production capabilities, foreign direct investment into the rest of Asia may be affected.
  5. As a corollary, regions may diverge further: Southeast Asia and India will likely emerge as volume-based back-end assembly and test hubs, specializing in select back-end processes, while Taiwan, the United States, Japan, and parts of Europe may focus on heterogeneous integration and advanced packaging, at varying levels of specialization.
  6. As AI data center buildouts continue to expand, they may further constrain electric power grids.40 The cloud and semiconductor companies likely to benefit will be those that proactively invested in, or planned around, power generation capacity and availability; those that didn’t factor power into the equation could face execution challenges.


Meet the industry leaders

Jeroen Kusters

Principal | US Semiconductor Leader

Deb Bhattacharjee

Principal | Global Semiconductor Center of Excellence leader | Deloitte Consulting LLP

Duncan Stewart

Research director

Jeff Loucks

Tech, Media & Telecom | Executive director

Jan Thomas Nicholas

Executive Director, Technology & Transformation, Deloitte Southeast Asia

Jordan Bish

Partner | Netherlands


Endnotes

  1. World Semiconductor Trade Statistics (WSTS), “Global semiconductor market approaches USD 1 trillion in 2026,” Dec. 2, 2025.

  2. A (spring 2026) Deloitte study on the AI chip market initially estimated that AI chips in 2026 would be about US$300B. Given the December 2025 upward revision of US$175B in the global chip market by the World Semiconductor Trade Statistics (all of which was driven by AI demand, with weakness in non-AI markets), Deloitte now estimates that the AI chip market in 2026 will be about US$500B.

  3. Deloitte analysis based on quarterly earnings call discussions, presentations, and insights shared by major semiconductor companies, cross-validated with secondary research data from third-party industry sources such as WSTS and International Data Corporation (IDC).

  4. Deloitte analysis of public market capitalizations in 2025, 2024, and 2023. The 2025 market cap was last updated based on the Dec. 15, 2025 market close.

  5. See note 2.

  6. ET Manufacturing, “AMD sees AI chip market exceeding $500 billion by 2028,” June 13, 2025; Max A. Cherney, “AMD expects profit to triple by 2030, data center chip market to grow to $1 trillion,” Reuters, Nov. 12, 2025.

  7. Deloitte analysis based on: Semiconductor Industry Association (SIA), “State of the US semiconductor industry 2025,” July 2025; Based on our discussions with SIA, 2025 YTD unit sales (as of Dec. 16, 2025) were 10% higher than 2024, and SIA estimates 1.05 trillion units of total chip sales for the full year 2025. Using WSTS’s estimates of US$772 billion worth of total semiconductor industry revenue for 2025, the average selling price per unit comes to US$0.74.

  8. A leading AI chip manufacturer has secured about 800,000 wafers for its main chip in 2026 and produces about 20 chips per wafer, suggesting approximately 16 million chips in total. With roughly an 80% market share, we estimate the annual total number of AI chips produced could be about 20 million.

  9. SEMI, “SEMI reports global silicon wafer shipments to rebound 5.4% in 2025, with new record expected by 2028,” press release, Oct. 28, 2025.

  10. The Straits Times, “Smartphone, PC sales expected to drop on higher prices due to surging memory chip costs,” Jan. 22, 2026; Francisco Jeronimo, “Global memory shortage crisis: Market analysis and the potential impact on the smartphone and PC markets in 2026,” IDC, Dec. 18, 2025; Dan Robinson, “Budget smartphones will be hit hardest as memory prices rise,” The Register, Jan. 15, 2026.

  11. TrendForce, “Rising memory prices weigh on consumer markets; 2026 smartphone and notebook outlook revised downward, says TrendForce,” press release, Nov. 17, 2025.

  12. WSTS, “Global semiconductor market show continued growth in Q2 2025,” press release, Aug. 4, 2025.

  13. Luke James, “AI data centers are swallowing the world’s memory and storage supply, setting the stage for a pricing apocalypse that could last a decade,” Tom’s Hardware, Oct. 3, 2025.

  14. PCPartPicker, “Memory price trends,” accessed Dec. 18, 2025.

  15. James, “AI data centers are swallowing the world’s memory and storage supply, setting the stage for a pricing apocalypse that could last a decade.”

  16. Counterpoint, “Memory prices soar by 50% in Q4, rally to continue in 2026,” Jan. 7, 2026.

  17. See note 2.

  18. Goldman Sachs, “How AI is transforming data centers and ramping up power demand,” Aug. 29, 2025.

  19. Luke James, “Jet engine shortages threaten AI data center expansion as wait times stretch into 2030—the rush to power AI buildout continues,” Tom’s Hardware, Oct. 27, 2025.

  20. Nestor Maslej et al., “The AI index 2025 annual report,” Stanford University Human-Centered AI, April 2025.

  21. Based on Deloitte’s analysis of articles from multiple third-party sources, including Semiconductor Engineering, The Wall Street Journal, Forbes, and Seeking Alpha, published during the second half of 2025 and the first quarter of 2026.

  22. Josh You and David Owen, “How much power will frontier AI training demand in 2030?” Epoch AI, Aug. 11, 2025.

  23. Duncan Stewart, Karthik Ramachandran, Prashant Raman, and Ariane Bucaille, “Rising trends: The new and next technologies worth putting on your radar,” Deloitte Insights, Nov. 19, 2024.

  24. Deloitte Consulting LLP performed an analysis of the AI data center’s rack infrastructure, including a rough bill of materials for the various semiconductor-based components and their underlying supply chain dynamics. This analysis is due to be published in the first quarter of 2026.

  25. In co-packaged optics, optical transceivers are placed directly next to (or within) the same package as the processor or application-specific integrated circuit using ultra-short die-to-die interconnects, tightening integration and, in turn, shortening the signal path between the processor and the optical interface. This helps reduce signal and electrical loss and lowers power usage. To read further, see: Sharada Yeluri, “Co-packaged optics–a deep dive,” APNIC, May 7, 2025.

  26. Oscoo, “HBF: A high-bandwidth flash new star breaking the ‘memory wall’ for AI,” Nov. 3, 2025.

  27. Link-PP, “AI data centers drive optical transceiver market growth,” Dec. 10, 2025.

  28. Deloitte analysis based on data from Gartner, “Forecast Analysis: AI Network Fabric, Worldwide, 2H25,” Oct. 31, 2025. Gartner is a registered trademark of Gartner, Inc. and/or its affiliates and is used herein with permission. All rights reserved.

  29. Yeluri, “Co-packaged optics–a deep dive.”

  30. Based on Deloitte analysis of AI data center chip offerings and solutions from merchant silicon players, fabless design companies, and hyperscalers.

  31. From our conversations with semiconductor industry subject matter specialists, we find that there are limited back-end chip manufacturing-related efforts in Europe, likely a function of having more fabless companies in the region. And that may affect the region’s ability to bolster its end-to-end chip manufacturing capabilities across front end and back end. To read more about talent challenges in the semiconductor industry, see: The Chronicle Journal, “The looming silicon ceiling: Semiconductor talent shortage threatens global AI ambitions,” Dec. 12, 2025.

  32. Jeroen Kusters, Deb Bhattacharjee, Jordan Bish, Jan Thomas Nicholas, Duncan Stewart, and Karthik Ramachandran, 2025 Global Semiconductor Industry Outlook, Deloitte Insights, Feb. 4, 2025.

  33. Based on Deloitte’s analysis of multiple strategic corporate investments that were made by companies across the semiconductor and tech industry, including hyperscalers, AI model companies, semiconductor fabless companies and integrated device manufacturers, and private equity firms during 2025.

  34. Duncan Stewart, Jeroen Kusters, Deb Bhattacharjee, Arpan Tiwari, Girija Krishnamurthy, and Karthik Ramachandran, “TMT Predictions 2026: Why AI’s next phase will likely demand more computational power, not less,” Deloitte Insights, Nov. 18, 2025.

  35. David Jarvis, Duncan Stewart, Nick Seeber, Gillian Crossan, Tim Bottke, and Girija Krishnamurthy, “TMT Predictions 2026: A new era of self-reliance: Navigating technology sovereignty,” Deloitte Insights, Nov. 18, 2025; Additionally, Deloitte analyzed multiple state-level strategic investments across the United States, the Middle East (especially in the Gulf region), Japan, and Europe, and found that various governments are funding their respective countries’ or region’s AI and data center plans and supporting local fabrication, assembly and test, advanced packaging, and research and development.

  36. Karthik Ramachandran, Duncan Stewart, Jeroen Kusters, Deb Bhattacharjee, Girija Krishnamurthy, and Jan Nicholas, “TMT Predictions 2026: New technologies and familiar challenges could make semiconductor supply chains more fragile,” Deloitte Insights, Nov. 18, 2025.

  37. Ibid.

  38. Reuters, “Exclusive: Nvidia considers increasing H200 chip output due to robust China demand, sources say,” Dec. 15, 2025.

  39. TrendForce, “Memory industry to maintain cautious capex in 2026, with limited impact on bit supply growth, says TrendForce,” press release, Nov. 13, 2025.

  40. By 2035, Deloitte estimates that power demand from AI data centers in the United States could grow more than thirtyfold, reaching 123 gigawatts, up from 4 gigawatts in 2024. AI data centers can require far more energy per square foot than traditional data centers. To read further, see: Martin Stansbury, Kelly Marchese, Kate Hardin, and Carolyn Amon, “Can US infrastructure keep up with the AI economy?” Deloitte Insights, June 24, 2025.

Acknowledgments

The authors wish to thank the following Deloitte colleagues for sharing their insights and perspectives: Brandon Kulik, Dan Hamling, Nina Zhang, Mike Luk, Steve Fineberg, Jeff Loucks, Girija Krishnamurthy, and Karan Aggarwal.

Additionally, they would like to thank Shannon Rothacher, Kristen Tatro, and Alison Zink for marketing and PR support, and Michelle Dollinger for helping with project management. 

Thanks also to Tobias Pröttel of World Semiconductor Trade Statistics and Greg LaRocca of Semiconductor Industry Association for their support in helping us leverage their market data and validating our analysis.

Thanks are also due to the Deloitte Insights team for their support in publishing this report.

Editorial (including production and copyediting): Andy Bayiates, Prodyut Borah, and Anu Augustine

Design: Molly Piersol

Cover artists: Rahul Bodiga and Jaime Austin

Knowledge services: Rohan Singh
