
CoreWeave Inc. today announced that it has won a multiyear contract to supply Anthropic PBC with cloud infrastructure.
The company's shares closed 10.8% higher on the news.
The data center capacity commissioned by the Claude developer will start coming online later this year. CoreWeave stated that the infrastructure will "support the development and deployment of Anthropic's Claude." That suggests Anthropic plans to run both model training and inference workloads on the platform.
CoreWeave didn't specify the dollar value of the deal. However, Chief Executive Michael Intrator told Bloomberg that Anthropic will use a "variety" of Nvidia Corp. chips deployed in U.S. data centers.
The most advanced Nvidia graphics card that the company currently offers is the Blackwell Ultra. It comprises two 4-nanometer dies linked together by a custom interconnect called NV-HBI. The technology streams data between the modules at a rate of 10 terabytes per second.
The Blackwell Ultra includes not only matrix multiplication units, the core building blocks of AI accelerators, but also more specialized circuits. There's so-called warp-synchronous memory that speeds up the task of sharing data between an AI model's components. Additionally, cores called Special Function Units accelerate mathematical operations that involve transcendental functions such as sine and logarithm.
The disclosure of the Anthropic partnership comes a day after CoreWeave expanded an existing infrastructure deal with Meta Platforms Inc. The revised contract covers "initial deployments" of Vera Rubin, the successor to the Blackwell Ultra. If those deployments come online in the near future, it's possible Anthropic will also use the new chip to power its CoreWeave-hosted workloads.
As of February, the cloud provider operated more than 43 data centers with about 850 megawatts of capacity. CoreWeave provides infrastructure to its largest customers through so-called Dedicated Access AZs. Those are single-tenant clusters, meaning each one is used by only a single organization. Cloud providers usually run workloads on infrastructure that is shared by multiple customers.
CoreWeave has incorporated multiple custom elements into its data centers. According to the company, its facilities can predict spikes in workloads' hardware usage and adjust cooling equipment accordingly. CoreWeave also reconfigures the associated power distribution hardware.
The company sells its AI infrastructure alongside other offerings. CoreWeave provides instances powered solely by central processing units that can be used to run general-purpose workloads. Additionally, it offers services that ease tasks such as fixing AI training errors.
According to CoreWeave, its cloud platform is used by 9 out of the world's 10 top AI model providers. That group includes not only Anthropic but also rival OpenAI Group PBC. Last year, the latter company agreed to rent $22.4 billion worth of infrastructure from CoreWeave.