In February 2026, four companies announced they would spend a combined $650 billion on AI infrastructure in a single year. Amazon committed $200 billion. Meta guided $115 to $135 billion. Alphabet said $175 to $185 billion. Microsoft was already burning $37.5 billion per quarter. The numbers are so large they've lost meaning. Try this: $650 billion is more than the GDP of Sweden.

Meanwhile, in Florida, a defense-tech founder named Aaron Sneed was running his entire company with 15 AI agents. No employees. His "council," built on ChatGPT and Nvidia tools, handled HR, legal, finance, supply chain, engineering, quality control, and communications. Business Insider ran the story on February 13th. Sneed estimated the setup saved him 20 hours a week. That's a conservative number for what is effectively a one-man operation with the functional capacity of a mid-size firm.

These two facts belong in the same paragraph because they tell the same story from opposite ends. The biggest companies on earth are building the most expensive infrastructure in history. And the primary beneficiaries will be people like Aaron Sneed.

The $650 billion isn't an investment. It's an involuntary subsidy.

The entity that builds the infrastructure rarely captures the value it creates

The pattern has a name in economics: infrastructure commoditization. The entity that builds the infrastructure rarely captures the value it creates.

AT&T spent decades building the American telephone network. By the 1960s, the Bell System connected 80 million phones across the country. But AT&T didn't capture most of the economic value those phones created. Insurance agents used them to sell policies. Pizza shops used them to take orders. Stockbrokers used them to execute trades. The telephone enabled trillions of dollars in economic activity. AT&T got a monthly phone bill.

The same pattern repeated with the internet. Telecom companies spent billions laying fiber and building out broadband. The value accrued to Google, Amazon, and Facebook, companies that built applications on top of infrastructure someone else paid for.

Now it's happening again, and the twist is that the same companies that captured value from the last infrastructure cycle are the ones building this one. Amazon, Google, Meta, and Microsoft are laying the digital equivalent of interstate highways. They believe they're building a moat. They're actually building a public good.

Open source closes the frontier gap within months, and no amount of spending can seal the leak

DeepSeek's R1 model, released under an MIT license, delivers reasoning performance comparable to OpenAI's o1 at $0.55 per million input tokens versus $15 for the equivalent proprietary model. That's a 27x cost difference. And it's open source, meaning anyone can run it on their own hardware for the cost of electricity.
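The 27x figure is simply the ratio of the two quoted prices. A back-of-envelope sketch, using the per-million-token rates cited above and a hypothetical monthly volume:

```python
# Per-million-input-token prices as cited in the text.
PROPRIETARY_PRICE = 15.00   # USD per 1M input tokens (o1-class)
OPEN_SOURCE_PRICE = 0.55    # USD per 1M input tokens (DeepSeek R1)

def monthly_bill(tokens_millions: float, price_per_million: float) -> float:
    """Cost in USD for a given monthly input-token volume."""
    return tokens_millions * price_per_million

volume = 500  # hypothetical: 500M input tokens per month
proprietary = monthly_bill(volume, PROPRIETARY_PRICE)
open_source = monthly_bill(volume, OPEN_SOURCE_PRICE)

print(f"Proprietary: ${proprietary:,.0f}/mo")      # $7,500/mo
print(f"Open source: ${open_source:,.0f}/mo")      # $275/mo
print(f"Ratio: {proprietary / open_source:.1f}x")  # 27.3x
```

The volume is an illustrative assumption; the ratio is volume-independent, which is the point.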

This isn't an isolated case. The gap between open-source and proprietary models has been shrinking every quarter since Meta released the original LLaMA weights in early 2023. Each new generation of open models closes approximately six months of the frontier gap within weeks of release. DeepSeek V3.2 now offers reasoning at $0.42 per million output tokens, down from $2.19 just a year earlier.

The mechanism: frontier labs spend billions training a model. They publish benchmarks. Within months, open-source teams reverse-engineer the architectural insights, apply post-training techniques like distillation and reinforcement learning, and replicate 80 to 90 percent of the capability at a fraction of the cost. Small teams then fine-tune these open models for specific domains, often outperforming the general-purpose frontier model on the tasks that actually matter to their business. The billions spent on training become a public research subsidy. The moat evaporates before it ever fills.


Capex is accelerating while revenue per unit of compute collapses

Capital expenditure for the four hyperscalers rose from $245 billion in 2024 to $410 billion in 2025 to a projected $650 billion in 2026. That's a 165 percent increase in two years. During that same period, the cost of frontier-level inference fell by more than 99 percent. What cost $60 per million output tokens with OpenAI's o1 at launch now costs $0.42 with DeepSeek's open-source equivalent.
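Both trends can be checked directly from the figures above (capex in billions of dollars, prices in dollars per million output tokens, all as cited):

```python
# Hyperscaler capex by year, $B, as cited in the text.
capex = {2024: 245, 2025: 410, 2026: 650}
growth = (capex[2026] / capex[2024] - 1) * 100
print(f"Capex growth 2024 to 2026: {growth:.0f}%")  # 165%

# Frontier inference price, USD per 1M output tokens, as cited.
price_then, price_now = 60.00, 0.42
decline = (1 - price_now / price_then) * 100
print(f"Inference price decline: {decline:.1f}%")   # 99.3%
```

Two numbers, opposite directions: spending up 165 percent while the price of the output falls by more than 99 percent.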

These two trends are moving in opposite directions. Spending goes up. Revenue per unit of compute goes down. The hyperscalers are building massive capacity, but the price they can charge for that capacity is falling faster than they can fill it.

Amazon reported 24 percent revenue growth in cloud. Alphabet's cloud backlog grew 55 percent quarter-over-quarter to $240 billion. But the concerning detail is in Microsoft's disclosure: 45 percent of its $625 billion in future cloud contracts come from a single customer, OpenAI. Strip that out and the picture changes considerably.

The market noticed. Amazon, Google, and Microsoft collectively lost $900 billion in market cap after their spending announcements, according to the Financial Times.

The enterprise cloud argument is valid, but it assumes Fortune 500 companies build the most valuable AI applications

The strongest counterargument: open-source models are good, but they're not good enough for applications that generate the most revenue. Enterprise customers need reliability guarantees, compliance certifications, and integration with existing systems. Running inference at scale requires infrastructure that only the hyperscalers can provide. The cost of self-hosting at production quality, with redundancy, latency requirements, and security, is far higher than hobbyists on Reddit assume. Open source closes the capability gap on benchmarks, but benchmarks aren't production.

This is partially true. For a Fortune 500 company running AI across millions of customer interactions, the managed cloud offering from Amazon or Microsoft is worth the premium. Integration, compliance, and uptime matter more than per-token cost.

But that argument assumes the most valuable AI applications will be built by Fortune 500 companies. And that's exactly the assumption that history should make you question.

The telephone's most valuable applications weren't built by AT&T. The internet's most valuable applications weren't built by telecom companies. The most valuable AI applications probably won't be built by the companies selling the compute. They'll be built by people and small teams who use the infrastructure without bearing the cost of building it.

Aaron Sneed isn't paying $200 billion for AI infrastructure. He's paying a ChatGPT subscription and an Nvidia hardware bill. The ratio of value he extracts to what he pays is astronomical. Multiply that by millions of operators doing the same thing, and you start to see the misdirection.

Sam Walton built Walmart on highways the government paid for — the same pattern is repeating

In 1956, President Eisenhower signed the Federal Aid Highway Act, committing $25 billion (roughly $275 billion in today's dollars) to build 41,000 miles of interstate highway. The government built the roads. It didn't capture the economic value those roads created.


Sam Walton did. Walmart's entire business model depended on the interstate system. Cheap, fast logistics across a national highway network let Walton build distribution centers in rural areas and undercut every competitor. The government subsidized the infrastructure. Walmart captured the margin.

The AI infrastructure buildout follows the same logic. The hyperscalers are spending $650 billion building compute capacity that anyone can access. The marginal cost of using that infrastructure is approaching zero for small operators, especially as open-source models eliminate the need to use proprietary APIs at all.

Every dollar the hyperscalers spend on infrastructure accelerates the collapse in your costs

For investors, the hyperscaler capex race looks like a prisoner's dilemma. No company can afford to stop spending because their competitors won't stop. But the return on each marginal dollar is declining because open source and falling inference costs compress margins. DA Davidson analyst Gil Luria described it as a "winner-take-all or winner-takes-most market." The more uncomfortable possibility: it's a race where every runner builds the track while running on it, and the spectators get to use the track for free afterward.

For builders and operators, this is the best time in history to start a company that relies on AI infrastructure. The biggest companies on earth are spending hundreds of billions to drive down the cost of the compute you need. Open-source models give you 80 to 90 percent of frontier capability at a fraction of the cost. The operational leverage available to a single person in 2026 exceeds what was available to a 50-person company in 2020.

For everyone watching from the sidelines, the cost of intelligence is collapsing. Not metaphorically. Literally. The price per unit of cognitive work (writing, analysis, research, code, design) is falling on a curve steeper than Moore's Law. Every dollar the hyperscalers spend on infrastructure accelerates this collapse. Their capex is your discount.

The action ladder starts with one bottleneck you can automate this week

1. Identify one operational bottleneck that doesn't require your unique judgment. Email triage. Data entry. First-draft writing. Contract review. Pick the most time-consuming one.

2. Deploy an AI agent to handle it. Not a chatbot you paste text into, but an agent with memory and access to your actual tools and data. The APIs exist. The open-source models exist. The only barrier is the decision to start.

3. Connect your agents to each other. The research agent feeds the writing agent. The writing agent feeds the publishing agent. This is how you turn $650 billion in someone else's infrastructure spending into your competitive advantage.
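The three steps above reduce to a pipeline: each agent's output becomes the next agent's input. A minimal sketch follows. Everything in it is a hypothetical placeholder (the agent names, their bodies, and the `pipeline` helper are illustrative, not a real framework); a production version would wrap actual model calls, memory, and tool access at each stage.

```python
from typing import Callable

# An agent here is just "string in, string out"; real agents would
# carry memory and tool access behind the same interface.
Agent = Callable[[str], str]

def research_agent(topic: str) -> str:
    # Placeholder: would gather sources and return structured notes.
    return f"notes on {topic}"

def writing_agent(notes: str) -> str:
    # Placeholder: would draft prose from the research notes.
    return f"draft based on {notes}"

def publishing_agent(draft: str) -> str:
    # Placeholder: would format and schedule the draft.
    return f"published: {draft}"

def pipeline(stages: list[Agent], payload: str) -> str:
    """Feed each agent's output into the next, in order."""
    for stage in stages:
        payload = stage(payload)
    return payload

result = pipeline([research_agent, writing_agent, publishing_agent], "AI capex")
print(result)  # published: draft based on notes on AI capex
```

The design choice that matters is the shared interface: once every agent takes and returns the same payload type, adding a fourth stage is one line, not a rewrite.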

The $650 billion is already being spent — are you going to drive on the roads or watch the cement trucks go by?

Forty years from now, business historians will write about the AI infrastructure buildout the way they write about the interstate highway system. They'll note that the companies that spent the most captured the least. They'll observe that the real value accrued to millions of small operators who used the infrastructure without paying for it.

The roads are being built. Are you going to drive on them?