Anthropic's Expansion and Google Cloud TPUs: A Strategic Move in AI
Anthropic has announced a significant expansion of its use of Google Cloud services, giving it access to up to one million Tensor Processing Units (TPUs). This move underscores Anthropic's commitment to leveraging cutting-edge cloud infrastructure to enhance its AI capabilities.
The expansion is valued at tens of billions of dollars, with more than a gigawatt of capacity expected to come online in 2026. It represents a major scale-up of Anthropic's compute strategy, reflecting the strong price-performance and efficiency that TPUs have demonstrated for the company over several years.
Anthropic's decision to expand its use of TPUs is part of a broader partnership with Google Cloud, which began in early 2023. This collaboration involves the use of Google's TPUs and cloud services for training and deploying AI models.
Anthropic maintains a multi-platform compute strategy, using Amazon Trainium chips and NVIDIA GPUs alongside TPUs. This diversified approach lets Anthropic continue advancing Claude's capabilities while maintaining strong partnerships across the industry.
The expanded compute capacity will help Anthropic meet growing demand for its Claude models, particularly among business and enterprise customers. The company reports serving more than 300,000 business customers, with the number of large accounts (those generating more than $100,000 in annual revenue) growing nearly sevenfold over the past year.
Over the past few months, Anthropic has rolled out significant updates to its Claude model lineup, including the launches of Claude Sonnet 4.5, Claude Haiku 4.5, and Claude Opus 4.1. These releases highlight Anthropic's focus on efficiency and scalability in AI development.
TPUs are custom chips developed by Google specifically for AI workloads and are accessible via Google Cloud services. Google recently revealed that its latest Gemini 2.5 models were trained using massive clusters of fifth-generation TPUs; the company's newest chip, Ironwood, is its seventh TPU generation.
Besides Anthropic, Apple is another prominent company that has used Google's TPUs for AI workloads. In 2024, Apple disclosed that it used 8,192 TPU v4 chips within Google Cloud to train its Apple Foundation Model, a large language model underpinning its AI features.
