Global AI spending will total $2.5 trillion in 2026, according to Gartner's January 15, 2026 press release. To contextualize the scale: the Apollo program cost approximately $200 billion in 2024 dollars. The Interstate Highway System cost roughly $500 billion. AI infrastructure spending in a single year is more than twelve times Apollo, five times the highway system, and roughly three and a half times both combined.
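The multiples are worth checking explicitly. A quick back-of-envelope in Python, using only the figures above:

```python
# Scale check: 2026 AI spending vs. two historic megaprojects.
# Figures as cited in the article (Apollo and Interstate in 2024 dollars).
ai_spend = 2.5e12      # Gartner, 2026
apollo = 200e9         # Apollo program
interstate = 500e9     # Interstate Highway System

print(f"vs Apollo:     {ai_spend / apollo:.1f}x")                 # 12.5x
print(f"vs Interstate: {ai_spend / interstate:.1f}x")             # 5.0x
print(f"vs combined:   {ai_spend / (apollo + interstate):.1f}x")  # 3.6x
```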

The ground-level data is more alarming than the headline number. JLL's North America Data Center Report documented data center vacancy at approximately 1% of total supply. With hundreds of billions flowing into new facilities, demand is absorbing supply as fast as it can be built. JLL described this as "sustained structural demand rather than cyclical imbalance" — not a bubble but a waiting list.

The Physical Constraint Nobody Votes On

Money can solve most infrastructure problems. Power is the exception. Fortune's Sharon Goldman documented that data centers already account for approximately 4% of U.S. electricity consumption. Goldman Sachs projects this reaching 8% by 2030, per their 2024 AI power demand analysis. The International Energy Agency projects global data center electricity demand could double by 2026 from 2022 levels.
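To make the percentages concrete, a rough sizing sketch. The ~4,000 TWh/year figure for total U.S. electricity consumption is an assumption of mine (a round number for illustration, not from the article), and it holds total consumption flat through 2030:

```python
# Rough sizing of the 4% -> 8% data center electricity claim.
# ASSUMPTION: total U.S. electricity consumption ~4,000 TWh/year,
# held constant through 2030 (illustrative only).
us_total_twh = 4000
share_now, share_2030 = 0.04, 0.08  # Fortune / Goldman Sachs shares

dc_now = us_total_twh * share_now     # data center demand today, TWh/yr
dc_2030 = us_total_twh * share_2030   # projected 2030 demand, TWh/yr

# Convert the added annual energy into continuous power (8,760 h/yr).
added_gw = (dc_2030 - dc_now) * 1e12 / 8760 / 1e9

print(f"Data centers today: ~{dc_now:.0f} TWh/yr")
print(f"Projected 2030:     ~{dc_2030:.0f} TWh/yr")
print(f"New continuous load: ~{added_gw:.0f} GW")
```

Under those assumptions the doubling implies roughly 18 GW of new continuous load — on the order of eighteen large power plants running flat out — which is one way to see why the hyperscalers are signing generation deals directly.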

The numbers from JLL's report make the constraint concrete: 92% of data center capacity currently under development is already pre-leased. The hyperscalers — Amazon, Google, Microsoft, Meta — are absorbing construction output before it comes online. Independent operators are on waiting lists. Startups attempting to build AI infrastructure without hyperscaler relationships can't get into the queue.

The response from the largest players has been to build their own power supply. Microsoft, Google, and Amazon have all announced deals for on-site natural gas plants, small modular nuclear reactors, and power purchase agreements directly with utilities. Microsoft signed a deal to restart Three Mile Island to power its AI infrastructure — a shuttered nuclear plant brought back online specifically to feed data centers. The implication: the power grid as it existed was designed for the electrical load of 2015, not the electrical load of 2030.

Data center electricity consumption: ~4% of U.S. grid in 2026, projected 8% by 2030 (Goldman Sachs). 92% of capacity under development already pre-leased (JLL North America Data Center Report). The constraint is physical, not financial: power grid capacity lags investment capital.

The Elon Musk / Eric Schmidt Bet on Orbital Compute

The 2015 vision of data centers in orbit — dismissed at the time as novelty — now comes up in serious infrastructure conversations. Elon Musk has stated publicly that more AI computing capacity could be in orbit than on Earth within five years. Former Google CEO Eric Schmidt and Alphabet CEO Sundar Pichai have both discussed orbital infrastructure in non-dismissive terms. The underlying logic: orbital data centers could use solar power continuously, require no terrestrial land, and bypass the permitting and utility coordination that makes ground-based data center construction a 24–36 month process rather than a 6-month one.

The engineering obstacles are real: heat dissipation in vacuum, launch costs, latency (light-travel time alone imposes roughly 500 ms round trips through geostationary orbit; low-Earth orbit cuts that to tens of milliseconds in practice), and maintenance access. These are not trivial. But the fact that orbital compute is in serious conversations about AI infrastructure supply is itself diagnostic of how severe the terrestrial constraints have become.
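The wide spread in quoted latency numbers is mostly a function of orbital altitude. A minimal sketch of the physical floor, counting only light-travel time for a request and response relayed through a satellite (real networks add routing and processing delay on top):

```python
# Light-travel latency floor for a bent-pipe satellite relay:
# request goes ground -> satellite -> ground, response comes back
# the same way, so one round trip covers 4 legs of the altitude.
C = 299_792_458  # speed of light, m/s

def round_trip_ms(altitude_km: float) -> float:
    """Propagation-only round-trip delay in milliseconds."""
    return 4 * altitude_km * 1000 / C * 1000

leo = round_trip_ms(550)      # typical low-Earth orbit altitude
geo = round_trip_ms(35_786)   # geostationary altitude

print(f"LEO floor: {leo:.1f} ms")   # ~7 ms
print(f"GEO floor: {geo:.0f} ms")   # ~477 ms
```

The propagation floor for low-Earth orbit is single-digit milliseconds; the hundreds-of-milliseconds figures belong to geostationary altitude.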

The Gartner Number Unpacked

The $2.5 trillion figure includes AI-optimized server spending (growing 49% in 2026 to represent 17% of total AI spending), AI infrastructure construction adding $401 billion per Computerworld's January 19 coverage, AI software, and AI services. The full Gartner category is broad. But even narrowed to hardware and infrastructure alone, the number represents the fastest capital concentration into a single technology category in history.
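Combining the components cited above gives a rough sense of how much of the $2.5 trillion is physical plant. The split below uses only the article's figures (17% AI-optimized servers, $401B construction); the remainder is implicitly software and services:

```python
# Decomposing Gartner's $2.5T using the components cited in the article.
total = 2.5e12
servers = 0.17 * total   # AI-optimized servers: 17% of total
construction = 401e9     # AI infrastructure construction (Computerworld)

hardware_and_infra = servers + construction
print(f"Servers:        ${servers / 1e9:.0f}B")          # $425B
print(f"Construction:   ${construction / 1e9:.0f}B")     # $401B
print(f"Hardware+infra: ${hardware_and_infra / 1e9:.0f}B "
      f"({hardware_and_infra / total:.0%} of total)")    # $826B (33%)
```

By this accounting, hardware and construction alone are roughly a third of the headline number — larger than the entire Apollo program plus the Interstate Highway System in a single year.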

Gartner separately reported that worldwide IT spending will total $6.15 trillion in 2026, with server spending accelerating 36.9% year-over-year. AI infrastructure spending isn't just a slice of IT spending — it's pulling the whole total up with it.

$2.5 trillion in AI spending in 2026 (Gartner January 2026): the fastest capital concentration into a single technology category in history. Apollo: ~$200B in 2024 dollars. Interstate Highway System: ~$500B in 2024 dollars. One year of AI spending: roughly a dozen Apollo programs, or five Interstate Highway Systems.

The Constraint That Matters for Everyone Not Named Amazon

The supply crunch in AI compute is not uniformly distributed. Hyperscalers with long-term power purchase agreements and pre-negotiated data center capacity have supply. Everyone else is competing for a constrained remainder. The practical consequence: AI startups and mid-size companies face structural capacity constraints that large incumbents don't.

This is a different version of the infrastructure commoditization argument. The hyperscalers aren't just building infrastructure for their own AI products — they're the only entities that can build enough of it to have predictable access. If you're not Amazon, Google, Microsoft, or Meta, your AI infrastructure strategy runs through one of them. The compute is available; the access is on their terms.


Sources: Gartner, "Worldwide AI Spending Will Total $2.5 Trillion in 2026," January 15, 2026; Gartner, worldwide IT spending forecast $6.15 trillion, February 3, 2026 (Computerworld coverage); JLL North America Data Center Report (current period); Fortune, Sharon Goldman, data center power consumption reporting; Goldman Sachs AI power demand analysis, 2024; IEA Global Energy Outlook; Al Jazeera, AI spending scale comparison to Apollo program

