Broadcom And Google Benefit Mightily From Anthropic's Meteoric Growth
The Next Platform

For so many years, the hyperscalers and cloud builders have dominated IT spending and much of the talk about system architecture. But the AI model builders, particularly Anthropic and OpenAI, are now their peers when it comes to massive infrastructure investments, and what they do - and do not do - also shapes the AI landscape.

To make those investments in AI infrastructure, the AI model builders have themselves had to raise tremendous amounts of money from investors - sometimes including their AI compute engine suppliers, which is a bit of roundtripping, indeed. Nvidia, AMD, Amazon Web Services, Microsoft Azure, and Google have all made investments in OpenAI and Anthropic, helping them prime the AI workload pump that will in turn lead to larger AI system sales in the future as more of us make use of AI applications.

Anthropic is growing like crazy right now, thanks to code assistant variants of its Claude model. Code assistants, as it turns out, are the killer app for GenAI, much to the chagrin of millions of programmers worldwide, but not of the programming managers who have mastered the art of keeping hundreds of AI coding agents busy working 24/7 on software projects. There are plenty of complaints about the quality of the code that GenAI models generate, but perhaps AI model revenue streams are the leading indicator of enthusiasm for the idea despite the many shortcomings of automated programming.

Given its pole position in code assistants and the consequently more explosive revenue growth, Anthropic's recent revenue curve is perhaps a better barometer than OpenAI's. Brent Thill and Blayne Curtis, equities analysts at Jefferies, have been tracking the annualized revenue run rate for these two AI model builders. Anthropic had around 500 customers paying it $1 million or more annually to license Claude back in February, and said this week that it has more than doubled that count. Two years ago, the overall revenue trajectory at Anthropic - which counts Claude model resale through AWS Bedrock and Google Vertex AI at gross revenue levels as well as any direct sales that Anthropic makes - was lower and growing more slowly than the net revenue figures that OpenAI reported, which include the net revenues OpenAI gets from Google, AWS, and Microsoft as well as direct API revenues. But early this year, OpenAI's revenues stumbled a bit, dropping from a $25 billion ARR to a $24 billion ARR, while Anthropic accelerated from a $14 billion ARR in February of this year to a $30 billion ARR this week.

Just for fun, we asked Claude to comb the Internet and build a relative ARR chart for OpenAI and Anthropic. Take a gander:

These figures mesh with the ones that the Jefferies folks have been tracking, which is the only reason we let Claude build the chart. It won't happen a lot here at The Next Platform. Remember, one is counting revenue at the customer level, and the other is counting revenue that it actually receives. IDC and Gartner have a similar split in the way they count things. IDC has always counted "factory revenue" - direct sales by vendors plus sales into the distribution channel - while Gartner counts the money that end users spend, which includes the channel overhead. We don't know what the overhead of the channel is for Anthropic, so it is tough to adjust the figures for an apples-to-apples comparison with OpenAI. What is clear is that Anthropic is taking off and maybe OpenAI is not. That is not a good thing for OpenAI when it is trying to go public this year, and it is a very good thing for Anthropic when it is also trying to go public this year.

Perhaps more concerning for Sam Altman & Co., OpenAI looks overvalued compared to Anthropic based on these revenue run rates. It is very hard to reckon cumulative funding for OpenAI, but various estimates put it at somewhere between $168 billion and $199 billion, with its last round coming in at $122 billion - with Nvidia, Amazon, and SoftBank kicking in most of that - and giving it a valuation of $852 billion.

But Anthropic, which has caught the code assistant wave with Claude Code in a way that OpenAI Codex just has not, is driving revenue but does not have so many investors that want to cash out so big, and is therefore not under such pressure to have a history-making, record-breaking initial public offering. Depending on relative profitability - by which we mean the level of losses compared to revenues, because neither company has a hope in hell of being profitable until the next decade begins, and even that is not necessarily going to happen - Anthropic might deserve a better IPO than OpenAI, despite having raised only around $67 billion in total and having a valuation of $380 billion.

Here's the important thing that everyone - Wall Street, Main Street, your cab driver, your mom - is watching. That revenue that OpenAI and Anthropic are chasing requires token chewing and token spitting. And the top brass at Anthropic are racking their brains trying to figure out how to get the money to get the infrastructure to meet future token demand.

Hence, the expanded deal among Broadcom, Google, and Anthropic, which Broadcom announced in an 8-K filing with the US Securities and Exchange Commission today, is going to get more iron into the hands of Google and Anthropic. Anthropic is a frenemy of Google in that it competes with Google's Gemini model but also uses Google TPU infrastructure to train and infer Claude.

Two things are happening. First, Google has inked a long-term agreement with Broadcom to help develop and make future TPUs, and has also signed a supply assurance agreement, running through the end of 2031, for networking and other components used in Google's own rackscale systems. As far as we know, MediaTek is also helping design and manufacture future TPUs for Google, but these appear to be variants of the mainstream compute engines - the chatter is all about variations of the TPU v7, TPU v8, and TPU v9 devices. It is understandable that Google wants a second source for its AI compute engines to mitigate risks.

Similarly, Anthropic has to mitigate risks by working with AWS on Trainiums, Google on TPUs, Nvidia on its GPUs, and perhaps soon, AMD on its GPUs. So Anthropic is renting TPU capacity on Google Cloud, but in 2027, it also plans to install its own TPU racks, built by Broadcom and authorized by Google, in its own datacenters. If Nvidia AI systems and their datacenters - the indisputable Cadillac of AI training and inference - cost on the order of $50 billion per gigawatt (which is a number that Nvidia co-founder and chief executive officer Jensen Huang has used a number of times), it is reasonable to assume that TPU infrastructure might only cost on the order of $30 billion to $35 billion per gigawatt.
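That cost gap can be put in back-of-envelope form. Here is a minimal sketch using the figures above - roughly $50 billion per gigawatt for Nvidia systems and an assumed $30 billion to $35 billion per gigawatt for TPU racks; these are estimates from the discussion, not vendor pricing:

```python
# Back-of-envelope capex comparison, using the per-gigawatt figures cited
# above: ~$50B/GW for Nvidia systems (a number Jensen Huang has used) and
# an assumed $30B-$35B/GW for TPU-based infrastructure. Illustrative only.

NVIDIA_COST_PER_GW = 50e9                # dollars per gigawatt
TPU_COST_PER_GW_RANGE = (30e9, 35e9)     # assumed low/high for TPU racks

def savings_vs_nvidia(tpu_cost_per_gw: float) -> float:
    """Fractional capex savings of TPU infrastructure versus Nvidia systems."""
    return 1.0 - tpu_cost_per_gw / NVIDIA_COST_PER_GW

best, worst = (savings_vs_nvidia(c) for c in TPU_COST_PER_GW_RANGE)
print(f"TPU capex savings vs Nvidia: {worst:.0%} to {best:.0%}")
# At $30B-$35B per gigawatt, TPU iron comes in roughly 30% to 40% cheaper.
```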

So boosting the capacity to 3.5 gigawatts will give Broadcom more dough, and depending on what Google charges Broadcom to resell its TPUs to Anthropic, that incremental revenue lowers Google's internal TPU bill. In a sense, Google is turning Broadcom into an OEM for Anthropic and is booking revenue in its cloud hardware division, offsetting its own TPU costs. If Google is charging a 25 percent markup, and it can get enough customers to buy 4X the number of TPUs that it needs for itself, it can get its own TPUs for free - and still charge a hefty profit to rent those TPUs out on its cloud. In the long run, the AI model builders will not rent huge amounts of capacity on the cloud. They will make the clouds sell them infrastructure on the cheap that they can amortize, thereby lowering the cost of tokens.
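The markup arithmetic above can be sketched in a few lines. The 25 percent markup and the 4X multiplier come straight from the paragraph; the function itself is just an illustration:

```python
# If Google earns a fixed markup on every TPU sold externally, the profit
# per external unit is markup * unit_cost. Covering the cost of its own
# internal fleet therefore takes internal_units / markup external sales.

def external_units_to_cover_fleet(markup: float, internal_units: float = 1.0) -> float:
    """External TPU sales needed so markup profit pays for the internal fleet."""
    if markup <= 0:
        raise ValueError("markup must be positive")
    return internal_units / markup

# At a 25 percent markup, customers must buy 4X what Google keeps for itself.
print(external_units_to_cover_fleet(0.25))  # 4.0
```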

Google's other option was to lose Anthropic as a customer and have it go off and create its own AI XPUs or do a similar deal with AWS or Microsoft or Meta Platforms.

Now, Anthropic has to come up with the money to pay for this gear in 2027, which is why it is doing an IPO in 2026 and not installing this TPU gear right now. In the meantime, Anthropic will rent TPU capacity on Google Cloud and Trainium capacity on AWS, pay through the nose, and get by.

Originally published by The Next Platform
