
Amazon and Anthropic have dramatically expanded their strategic partnership, moving beyond simple venture funding into a massive, decade-long infrastructure and capital commitment. In an announcement today, Amazon revealed a new $5 billion direct investment in the AI safety and research firm, accompanied by an unprecedented $20 billion in milestone-based payments. This follows two previous $4 billion rounds in 2023 and 2024, bringing Amazon's total potential capital injection into the startup to $33 billion.
The agreement centres on a symbiotic relationship between Anthropic's Claude models and Amazon Web Services (AWS). On Anthropic's side, the commitment is staggering: the company has pledged to spend more than $100 billion on AWS technologies over the next ten years.
This capital will primarily fund the massive compute power required to train the next generation of frontier AI models. As part of the deal, Anthropic has secured access to up to 5 gigawatts of power capacity, roughly enough to power 3.75 million homes, to support its current and future data centres.
The partnership also bolsters Amazon's ambitions as a chip designer. Anthropic will continue to use Amazon's custom Trainium silicon to train its models, providing critical real-world validation of Amazon's hardware as an alternative to Nvidia's dominant GPUs.
For enterprise customers, the deal promises a more frictionless experience, with Claude models continuing to be available directly through AWS services such as Amazon Bedrock.
This deal is widely viewed as Amazon's counter-offensive to the partnership between Microsoft and OpenAI. By locking Anthropic into a $100 billion spending commitment, Amazon ensures that one of the world's most advanced AI labs will remain tethered to its cloud ecosystem through 2036.
While the $20 billion in milestone payments depends on Anthropic achieving specific technical and commercial breakthroughs, the sheer scale of the 5-gigawatt power reservation suggests that both companies expect the demand for Claude's capabilities to scale exponentially.