
Anthropic Bought 300 MW From Musk. AI Stopped Being a Software Business.

Anthropic licensed every megawatt it could find. $200B Google, 5 GW AWS, plus Microsoft, SpaceX, Fluidstack. The bottleneck moved to electricity.

6 min read


TL;DR: Anthropic does not have a cloud provider. It has an electricity portfolio. In 24 hours on May 5 and 6, 2026, it signed two of the largest compute commitments ever disclosed: a reported $200 billion spend with Google Cloud and the full capacity of Elon Musk's Colossus 1: 300 megawatts, 220,000 NVIDIA GPUs. Stack AWS, Microsoft, and Fluidstack on top. AI labs stopped competing on models a while ago. They are competing on substations now.


Why is AI compute now a power business, not a software business?

Ask what is scarce.

Models are not. DeepSeek R1 replicated frontier capability cheaply in early 2025.  [1] Llama, Qwen, Mistral, Gemma, GLM are public. Distillation works. The gap between the best closed model and a competent open one keeps shrinking by the quarter. Models replicate. That is what software does.

Megawatts are. A new hyperscale data center needs land with a fiber path, a substation with spare capacity, cooling water, and a utility willing to sign a 15-year power purchase agreement. The constraint is the grid, not silicon. TSMC can ship more wafers. ERCOT and TVA, the regional power authorities running Texas and the southeast, cannot conjure more turbines.

If models commoditize and electricity does not, the moat shifts to whoever controls the megawatts. Anthropic does not have a cloud provider. It has an electricity portfolio.

Why did Anthropic sign two giant compute deals in 24 hours?

The evidence arrived in two press cycles.

Tuesday May 5: The Information reported Anthropic had committed to spend $200 billion with Google Cloud over five years for TPU and cloud capacity.  [2] That sits on top of Alphabet's $40 billion equity investment from April 24, priced at a $350 billion valuation.  [3]

Wednesday May 6: Anthropic announced the SpaceX partnership.

The number under that post is the one to remember. Over 300 megawatts. 220,000 NVIDIA GPUs. Available within a month.  [4] That is a full data center, Colossus 1 in Memphis, switched from serving Grok to serving Claude.  [5]

Two deals, 24 hours, two of the largest disclosed compute commitments on the planet. Read together, the shape of the market is the story.

How big is Anthropic's compute portfolio in May 2026?

Stack the disclosed deals up. They look like an electric utility's customer list, not a software company's vendor list.

| Partner | Disclosed commitment | What it buys | When |
| --- | --- | --- | --- |
| Google Cloud | $200B over 5 years, multiple GW (reported); $40B equity at $350B valuation | TPU 8 training and inference  [3]  [6]  [2] | $40B equity Apr 24, 2026; $200B spend May 5, 2026 |
| AWS | 5 GW | Trainium-class capacity  [4] | Anchor partnership 2024–2025 |
| Microsoft + NVIDIA | $30 billion of Azure capacity | NVIDIA GPU clusters  [4] | Late 2025 |
| SpaceX (Colossus 1) | 300 MW / 220,000 GPUs | NVIDIA GPUs in Memphis  [4]  [5] | May 6, 2026 |
| Fluidstack | $50 billion of domestic infrastructure | Mixed  [4] | 2025–2026 |

[Figure: horizontal bar chart of Anthropic's disclosed compute by partner, May 2026 — the multi-gigawatt Google Cloud and AWS Trainium commitments highlighted, Microsoft and Fluidstack as dollar-anchored bars, SpaceX Colossus 1 at 300 MW]

Combined disclosed footprint sits in the 10 GW range. Outside OpenAI's 30 GW 2030 target, no other lab is on this scale.  [7] Anthropic owns none of the silicon, data centers, or substations. It rents all three from companies that also ship competing models. That is the strategy.
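The "10 GW range" claim can be sanity-checked with back-of-envelope arithmetic. Only the AWS and SpaceX deals are stated in power terms; the GW-equivalents assigned below to the dollar-denominated Google, Microsoft, and Fluidstack commitments are illustrative assumptions, not disclosures.

```python
# Rough tally of Anthropic's disclosed power footprint, in gigawatts.
# Entries marked "assumed" are illustrative guesses for dollar-only deals.
disclosed_gw = {
    "AWS (Trainium)": 5.0,              # stated: 5 GW
    "SpaceX (Colossus 1)": 0.3,         # stated: 300 MW
    "Google Cloud (assumed)": 3.0,      # "multiple GW" reported; 3 GW assumed
    "Microsoft/NVIDIA (assumed)": 1.0,  # $30B of Azure; 1 GW assumed
    "Fluidstack (assumed)": 1.0,        # $50B of infrastructure; 1 GW assumed
}

total_gw = sum(disclosed_gw.values())
print(f"Illustrative total: {total_gw:.1f} GW")  # → Illustrative total: 10.3 GW
```

Even with conservative guesses for the dollar-denominated deals, the stated 5.3 GW floor plus "multiple GW" from Google lands the portfolio in the 10 GW range.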

Why did Musk go from suing AI labs to renting them GPUs?

A year ago Musk was suing OpenAI and using "Misanthropic" as a punchline. This week he posted that he had spent time with Anthropic's senior team, "was impressed," and that "no one set off my evil detector."  [8]

The pivot reads cleaner as a balance sheet event. SpaceX absorbed xAI in February at a $1.25 trillion combined valuation  [9] and is targeting a June 2026 Nasdaq debut at a reported $1.75 trillion.  [10] A 300 MW data center burning capex with no anchor tenant is a write-down on the eve of an IPO. Anthropic's signature turns idle silicon into committed revenue. The "evil detector" went silent because the depreciation schedule was loud.

What happens when your AI competitor is also your landlord?

The same companies show up as Anthropic's investor, supplier, and rival.

  • Google invests $40B, sells multiple GW of TPU (reported), ships Gemini against Claude on the same chips.  [3]  [6]
  • Amazon anchors Anthropic with multi-billion-dollar funding, sells 5 GW of Trainium, ships Nova from Bedrock.
  • SpaceX sells 300 MW of Colossus, ships Grok from the same campus.

[Figure: Venn diagram of three overlapping circles labeled Investor, Supplier, and Rival, with Google, AWS, and SpaceX in the center triple-overlap region]

Boeing does not rent assembly lines from Airbus. Goldman does not rent trading floors from Morgan Stanley. AI does because idle capex loses money faster than competitive concerns lose sleep. The labs that compete on models cooperate on watts. The cap table is a competitive map. The colo is a peace treaty.

Who wins when AI is bottlenecked by electricity?

Sort the field by power, not parameters, and the winners line up differently.

Hyperscalers with substation rights win. Google, AWS, and Microsoft sit on years of permitting, signed PPAs, and water agreements. Every megawatt they bring online prints rent twice: from their own training and from the labs renting next door.

Anthropic wins by being multi-homed. OpenAI is anchored to Microsoft and the Stargate consortium: one host, one renegotiation, one pricing power.  [11] Anthropic buys from Google, AWS, Microsoft, SpaceX, and Fluidstack: five hosts, five negotiations, five arbitrage axes. The $350 billion valuation is for the portfolio, not the model lead.

Single-cloud labs lose. OpenAI's $122 billion fundraise pours into the same megawatt chase, mostly inside Microsoft.  [11] Works until the host raises rent, ships a competing product, or suffers an outage. Hedging is what mature industries do.

What should a builder do differently after these deals?

If you ship product on top of model APIs, the next two years of unit economics live in three moves.

Treat model providers like fuel suppliers. Airlines hedge fuel. Inference cost will behave the same way once margin pressure shows up. Pick at least two providers per critical path so a price move on one does not break your P&L.
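The "two providers per critical path" rule can be sketched as a simple ordered-fallback wrapper. The provider functions below are hypothetical placeholders, not real SDK calls; swap in your actual client libraries.

```python
# Two-provider fallback for a critical inference path: try each provider
# in order, fail only if every one of them fails.
from typing import Callable

def call_primary(prompt: str) -> str:
    # Placeholder: simulate the primary provider being over quota or down.
    raise RuntimeError("primary provider over quota")

def call_secondary(prompt: str) -> str:
    # Placeholder: the backup provider answers instead.
    return f"secondary answered: {prompt}"

def complete(prompt: str, providers: list[Callable[[str], str]]) -> str:
    """Route to the first provider that succeeds."""
    last_error: Exception | None = None
    for provider in providers:
        try:
            return provider(prompt)
        except Exception as err:
            last_error = err  # record and fall through to the next provider
    raise RuntimeError("all providers failed") from last_error

print(complete("hello", [call_primary, call_secondary]))
# → secondary answered: hello
```

In production the list would hold real clients, and you would add per-provider timeouts and cost logging, but the structural point stands: one provider's price move or outage becomes a routing event, not a P&L event.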

Watch which labs sign multi-source power deals. Anthropic buys from five. OpenAI buys mostly from Microsoft. Multi-source means pricing power scales with the providers' competition. Single-source means you ride the host's margin.

Track tokens-per-watt, not parameters. Quality is roughly flat across the top three labs now. The differentiator is the cost of serving a token, and that cost is set by power. Read data center announcements before model release notes.
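The tokens-per-watt framing reduces to two lines of arithmetic. Every number below is an illustrative assumption, not a disclosed figure; the point is the shape of the calculation, not the values.

```python
# How electricity sets the floor on serving cost (illustrative figures).
power_kw = 10.0             # assumed draw of one inference server, in kW
tokens_per_second = 20_000  # assumed aggregate throughput of that server
price_per_kwh = 0.08        # assumed industrial electricity price, USD

# Throughput per unit of energy: tokens generated for each kWh consumed.
tokens_per_kwh = tokens_per_second * 3600 / power_kw

# Electricity cost alone, per million tokens served.
energy_cost_per_million_tokens = 1e6 / tokens_per_kwh * price_per_kwh

print(f"{tokens_per_kwh:,.0f} tokens per kWh")
print(f"${energy_cost_per_million_tokens:.4f} energy cost per 1M tokens")
```

Double the tokens-per-kWh (better chips, better batching) and the energy floor halves; double the electricity price and it doubles. Model size appears nowhere in the formula except through its effect on throughput per watt.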

I wrote earlier that AI was splitting into two economies, training and inference. Both run on electricity. Anthropic's announcement slipped in a line about "multiple gigawatts" of orbital compute.  [4] When the bottleneck is the grid, the next move is power off the grid.


Key takeaways

  • Models replicate, electricity does not. The moat moved from architecture to substations.
  • Anthropic now operates in the 10 GW range of disclosed compute across Google, AWS, Microsoft, SpaceX, and Fluidstack.  [4]
  • Two deals in 24 hours. $200 billion with Google Cloud (May 5) and 300 MW of Colossus 1 (May 6).  [4]  [2]
  • Anthropic's $350B valuation is for the portfolio, not the model. OpenAI is single-homed in Microsoft. Anthropic has five negotiations.
  • Musk's pivot is a balance sheet event. SpaceX needs Colossus booked before its June 2026 IPO at a reported $1.75 trillion valuation.  [10]
  • Builders should hedge providers like fuel and track tokens-per-watt, not parameters.

Frequently asked questions

Why is AI now a power business?

Models replicate, electricity does not. Open weights and distillation have collapsed the cost of catching up to a frontier model. The thing that has not collapsed is the substation, the cooling water, or the construction permit. The lab that secures megawatts ships product. The lab that does not, throttles.

What did Anthropic buy from SpaceX?

The full compute capacity of Colossus 1, SpaceX's data center in Memphis. Over 300 megawatts of power and over 220,000 NVIDIA GPUs, available within a month.  [4] Anthropic uses it to raise Claude Code rate limits and lift peak-hour caps for Pro and Max subscribers.

How much compute does Anthropic now have access to?

In the 10 GW range of disclosed capacity: multiple GW from Google Cloud TPU (reported, with capacity coming online from 2027), 5 GW from AWS Trainium, plus 300 MW from SpaceX, $30 billion of Azure capacity from Microsoft and NVIDIA, and a $50 billion Fluidstack deal.  [4] Outside OpenAI's stated 30 GW 2030 target, no other lab is on this scale.

How does Anthropic's compute strategy compare to OpenAI's?

OpenAI is anchored to Microsoft and the Stargate consortium: one host, one renegotiation, one pricing power. Anthropic buys from Google, AWS, Microsoft, SpaceX, and Fluidstack: five hosts, five negotiations, five arbitrage axes. The $350 billion valuation is for the portfolio, not the model lead.

What does this mean for AI builders?

Treat model providers like fuel suppliers. Pick at least two per critical path. Watch which labs sign multi-source power deals; their pricing power scales with their providers' competition. Track tokens-per-watt, not parameters.


I break down things like this on LinkedIn, X, and Instagram. Usually shorter, sometimes as carousels. If this resonated, you would probably like those too.


Sources

  1. Earlier piece: TurboQuant and the inference cost shift

  2. Engadget: Anthropic reportedly agrees to pay Google $200 billion for chips and cloud access (May 5, 2026)

  3. TechCrunch: Google to invest up to $40B in Anthropic in cash and compute

  4. Anthropic: Higher usage limits for Claude and a compute deal with SpaceX

  5. CNBC: Anthropic and SpaceX announce compute deal that includes space development

  6. Anthropic: Expanding our use of Google Cloud TPUs and Services

  7. Earlier piece: how the AI silicon stack is moving from one chip to two economies

  8. Yuchen Jin tweet quoting Elon Musk on Anthropic safety visit

  9. Fortune: Elon Musk's SpaceX buys xAI in stunning deal valued at $1.25 trillion ahead of looming IPO

  10. TechMarket Briefs: SpaceX IPO 2026 valuation analysis

  11. Earlier piece: What OpenAI's $122B raise means for AI infrastructure
