Amazon doubles down on Anthropic with $25 billion: 5GW of computing power and a $100 billion-plus AWS commitment

ChainNewsAbmedia

According to an official Anthropic announcement and an official Amazon press release, the two companies expanded their strategic partnership on April 20: Amazon will invest up to an additional $25 billion in Anthropic, while Anthropic committed to spending more than $100 billion on AWS over the next ten years and will obtain up to 5GW of additional compute capacity for training and deploying Claude models.

This is the second major expansion in Anthropic’s compute arms race, following the April 7 collaboration with Google and Broadcom that secured 3.5GW of TPU compute, and it is Amazon’s largest single commitment to an AI company to date. The new round also sets Anthropic’s valuation at $380 billion.

Investment structure: $5 billion upfront, $20 billion tied to milestones

Amazon’s investment this time is split into two phases: $5 billion funded immediately on the day of the announcement, plus up to an additional $20 billion released in tranches tied to “specific commercial milestones.” Together with the $8 billion previously invested, Amazon’s cumulative investment in Anthropic will reach an upper limit of $33 billion.

The $5 billion in this round is invested at Anthropic’s latest valuation of $380 billion, marking the first time this valuation has been confirmed by a top-tier investor through a new agreement.

Anthropic commits to $100 billion in AWS spending for the next decade

As consideration for the partnership, Anthropic has committed that its spending on AWS over the next ten years will exceed $100 billion. The scope covers current and future generations of Trainium custom AI chips, as well as “tens of millions of cores” of Graviton general-purpose CPUs.

In the announcement, Andy Jassy (Amazon CEO) said: “Anthropic’s commitment to run large language models on AWS Trainium over the next ten years reflects our shared progress on the custom chip path.” Dario Amodei (Anthropic CEO) added: “Users are telling us that Claude is becoming increasingly important to their work, and we must build the infrastructure that can keep up with growth in demand.”

5GW compute roadmap: Trainium2, 3, and 4 all locked in

The agreement covers three generations of chips: Trainium2, Trainium3, and Trainium4, with Anthropic retaining the option to purchase subsequent generations of custom chips. On the schedule, large-scale Trainium2 capacity will come online in Q2 2026, and large-scale Trainium3 capacity will be rolled out progressively before the end of that year. For the full year, cumulative Trainium2 and Trainium3 capacity will total nearly 1GW.

Project Rainier is the flagship project under the two companies’ existing collaboration. The training cluster currently has about 500k Trainium2 chips deployed and serves as the primary infrastructure for training Claude models.

Revenue from $9 billion to $30 billion: surging demand for Anthropic drove the negotiations

In the announcement, Anthropic made a rare disclosure of its own financials. Its annualized revenue for the current year has already surpassed $30 billion, more than triple the $9 billion reported at the end of 2025, in roughly half a year. The number of enterprise customers running Claude on AWS has exceeded 100k, and Claude is among the most-used model families on Amazon Bedrock.

Anthropic also acknowledged that the surge in demand has put pressure on infrastructure during peak hours, affecting availability and performance, which is the direct motivation behind this round of large-scale compute expansion. The recent token-counting dispute around Claude Opus 4.7 and the adjustments to usage limits are likewise tied to the reality that infrastructure has become constrained.

Three-way compute lock-in: AWS, Google, and in-house chips move forward in parallel

Together with the 3.5GW of TPU compute secured through the earlier April announcement of its collaboration with Broadcom and Google, Anthropic is now simultaneously locking in both the AWS Trainium and Google TPU custom-chip roadmaps, while retaining the long-term option of its own custom accelerators. This contrasts with OpenAI’s path of relying mainly on Microsoft Azure and only recently expanding into AWS.

For Taiwan’s semiconductor supply chain, the large-scale expansion from Trainium2 through Trainium4 means that Marvell, TSMC’s advanced packaging, and HBM memory suppliers will continue taking on AWS custom-chip orders over the next three to five years.

Compute muscle behind the IPO race

This announcement also reinforces the foundation of Anthropic’s IPO narrative. Recent reports put OpenAI’s annualized revenue above $25 billion as it prepares for an IPO, with Anthropic trailing at $19 billion. With the official announcement that Anthropic’s annualized revenue has now reached $30 billion, the revenue gap between the two is narrowing further.

For investors, the significance of Amazon’s commitment lies not in the size of any single transaction but in the deep lock-in of “$100 billion in AWS spending over the next ten years,” which gives the market a clear anchor for Anthropic’s compute supply and long-term cost structure. The high-threshold model Mythos under Project Glasswing, along with training compute for the subsequent Claude 4.x series, will be supported by this new 5GW of capacity.

This article “Amazon boosts investment in Anthropic by $25 billion: 5GW compute, $100B AWS lock-in” first appeared on 鏈新聞 ABMedia.
