Google will supply up to 1 million AI chips to the artificial intelligence (AI) startup Anthropic, seen as a rival to OpenAI, Bloomberg reported on the 23rd. The deal will give Anthropic large-scale computing capacity from Google.
In materials released that day, Anthropic said it will expand its use of Google Cloud technologies, including up to 1 million tensor processing units (TPUs), to broaden the scope of its AI research and product development. It noted that the expansion is worth tens of billions of dollars (tens of trillions of won) and is expected to bring well over 1 gigawatt (GW) of capacity online next year.
The industry estimates the cost of building a 1 GW data center at about $50 billion (about 72 trillion won), of which roughly $35 billion typically goes to AI chips.
Anthropic currently serves more than 300,000 corporate customers, and the number of large customers generating more than $100,000 in annual revenue has grown nearly sevenfold over the past year. Anthropic said the expanded use of Google Cloud will help it meet surging customer demand.
Anthropic said its computing strategy takes a multifaceted approach that efficiently leverages three chip platforms: Google's TPUs, Amazon's Trainium, and Nvidia's graphics processing units (GPUs). The company added that this multi-platform strategy ensures it can continue to improve Claude's performance while maintaining strong partnerships across the industry, and that it will continue its collaboration with Amazon.
Google's TPU is a chip developed in-house by Google and specialized for AI and machine learning workloads. Google did not say how Anthropic will pay for TPU usage.
Thomas Kurian, CEO of Google Cloud, said Anthropic highly values the strong performance and efficiency of TPUs and has decided to greatly expand its use of them, adding that Google is building out a mature AI accelerator portfolio based on Ironwood, its seventh-generation TPU specialized for inference.