
Anthropic will spend hundreds of billions of dollars on Google's chips and cloud services in a push to secure critical computing resources as surging demand for the company's tools pushes its annualised revenue to $30bn.
The AI lab said on Monday that it has committed to using "multiple gigawatts" of capacity from Google's TPUs, rival chips to Nvidia's dominant GPUs, as well as the search giant's cloud services.
Around 3.5GW of capacity on Google's hardware will come through a partnership with chipmaker Broadcom, starting from next year, according to a separate filing on Monday.
In all, the deal would give Anthropic access to close to 5GW in new computing capacity over the coming years, according to a person with knowledge of the terms.
The hardware and infrastructure required to develop a single gigawatt of capacity (roughly equivalent to the power output of a nuclear reactor) is estimated to cost between $35bn and $50bn, with the bulk of that spent on chips. That suggests the lossmaking start-up's commitment could run to hundreds of billions of dollars.
Anthropic executives are racing to secure enormous supplies of computing power in order to meet rapidly growing demand for the company's tools, particularly coding agent Claude Code, and to fund costly model training.
The San Francisco-based group's annualised revenue has shot up from $9bn at the end of last year to $30bn at the end of March, Anthropic said on Monday. The figure represents its revenues from the past 28 days extrapolated over a full year.
"We are building the capacity necessary to serve the exponential growth we have seen in our customer base while also enabling Claude to define the frontier of AI development," said Krishna Rao, Anthropic's chief financial officer.
Broadcom shares rose almost 3 per cent after the market closed on Monday. The company also announced that it would develop and supply custom TPUs for Google as part of a long-term agreement through 2031.
Google is seeking to expand sales of its in-house chips, which have helped power its own Gemini AI models, bringing it into increasingly direct competition with Nvidia, the world's largest semiconductor group.
Anthropic's rival OpenAI last year struck a string of computing deals with Broadcom, Nvidia, AMD and others in a push to lock in as much capacity as possible to power its own AI tools.
The deals have been criticised for their circularity, with Big Tech groups acting as customers, suppliers and investors in the AI labs. Google has invested billions into Anthropic, giving it a 14 per cent stake as of March last year, according to a legal filing.
Both companies have faced scrutiny over their heavy outlays, having repeatedly returned to venture capital and sovereign wealth backers to raise tens of billions of dollars on the promise that they can become profitable if they build sufficient scale and market dominance. Anthropic raised $30bn in February in a deal valuing it at $380bn, including the new money.
In its filing on Monday, Broadcom said the deal was "dependent on Anthropic's continued commercial success" and that the parties "are in discussions with certain operational and financial partners".
Monday's deal expands on a partnership Anthropic announced with Google last October, which it said at the time was "worth tens of billions of dollars and is expected to bring well over a gigawatt of capacity online in 2026".
In November, Anthropic also committed to spend $50bn on new data centres in Texas and New York with cloud computing group Fluidstack, and agreed to purchase $30bn of additional capacity from Microsoft and Nvidia.