CNBC / SiliconAngle | Business | 2026-04-07

Anthropic Hits $30B Revenue Run Rate as Broadcom Confirms Expanded TPU Deal Through 2031

Anthropic has crossed a $30B annual revenue run rate — up from $9B at year-end 2025 — while Broadcom confirmed a new multi-year deal to supply Google TPUs to Anthropic through 2031, covering roughly 3.5 GW of compute capacity.


Anthropic's revenue trajectory has accelerated sharply into 2026. The company disclosed a $30B annual run rate this week, more than triple the $9B figure reported at the end of 2025. Over 1,000 enterprise customers now spend at least $1M per year on Claude API access, and headcount has grown to roughly 4,200 employees following a series of acquisitions.

Simultaneously, Broadcom confirmed an expansion of the compute infrastructure underpinning Anthropic's growth. The chipmaker announced a long-term agreement to supply custom Google TPU v6 chips to Anthropic beginning in 2027, with capacity scaling to approximately 3.5 gigawatts through 2031. This is a meaningful expansion beyond the earlier Google cloud compute partnership disclosed in February, which covered roughly $5B in cloud credits.

The dual announcement — revenue milestone plus hardware commitment — signals a maturation in how Anthropic is structuring its growth. Rather than relying on spot cloud purchasing, the company is locking in multi-year compute contracts as it scales model training for its next generation of Claude models, reportedly codenamed Mythos and targeting release in Q3 2026.

For investors, the $30B run rate puts Anthropic within striking distance of OpenAI's reported $40B figure, and ahead of Google's internally disclosed AI revenue (approximately $18B annually from Gemini API and Workspace AI features). The Broadcom deal also provides cost predictability — a significant advantage as training runs for frontier models now regularly exceed $100M.

The broader implication: the compute arms race among frontier labs is shifting from who has the most cloud credits to who has locked in the most dedicated silicon. Anthropic's 3.5 GW commitment, combined with Amazon's $4B investment and Google's $3B, gives the company more confirmed compute runway than at any point in its history.

Panel Takes

The Builder

Developer Perspective

The 1,000+ customers at $1M+/year figure is the most interesting number here. That's a mature enterprise sales motion at a company that was still mostly researcher-focused two years ago. The API has genuinely become infrastructure for a lot of serious businesses.

The Skeptic

Reality Check

A $30B run rate sounds impressive until you remember the company is spending significantly more than that on compute, salaries, and infrastructure. The profitability question never gets asked in these announcements. Locking in 3.5 GW of TPU capacity is a massive capital commitment with a lot of execution risk.

The Futurist

Big Picture

The Broadcom deal is the real story. When AI companies are signing five-year silicon contracts measured in gigawatts, the compute layer is becoming a strategic moat, not a commodity input. This is the kind of infrastructure commitment that determines which labs can even stay in the frontier race through 2030.