Anthropic has scaled up rapidly on the back of the success of its coding assistant Claude, emerging as a key player in the artificial intelligence (AI) model market. According to industry sources, Anthropic's annual recurring revenue (ARR) recently topped $30 billion, prompting assessments that it now threatens OpenAI, whose momentum had slowed.

Illustration: ChatGPT DALL·E 3

Behind this growth lies a sophisticated "AI infrastructure alliance" linking Anthropic, Google, and Broadcom that goes beyond a simple model performance race. Beneath the flashy model competition, what ultimately dictates where the money flows is control over infrastructure: not who builds the smarter AI, but who designs and supplies the compute resources to run those models determines the market's real revenue structure.

Anthropic's growth translates directly into surging compute demand. In generative AI, costs climb steeply as the number of tokens generated rises. Because of this, Anthropic currently leases Google Cloud's TPU (AI accelerator) infrastructure in large volumes and, in the mid to long term, reportedly plans to internalize its infrastructure by installing roughly 3.5 gigawatts (GW) of jointly developed Google-Broadcom TPU racks in its own data centers starting in 2027. That 3.5 GW is equivalent to the output of three to four nuclear power plants, an overwhelming scale of AI infrastructure investment for a single company.

The biggest beneficiary in this structure is clearly Broadcom. Broadcom is a core partner deeply involved in developing Google's TPUs, participating from the chip design stage as an effective co-designer rather than a simple parts supplier. Whether Anthropic rents Google Cloud capacity or builds its own servers, the hardware traces back to Broadcom's application-specific integrated circuits (ASICs) and ultra-high-speed networking technology. The market expects Broadcom to generate $21 billion in AI-related sales in 2026 and $42 billion in 2027 through this partnership with Anthropic. Each time Anthropic leaps forward, Broadcom's revenue effectively doubles.

Google's strategy is equally meticulous. It is pursuing a dual "frenemy" (friend and enemy) strategy: tying Anthropic, a rival to its in-house AI model Gemini, to its cloud as a customer while simultaneously expanding the TPU ecosystem. Even if Google cedes some share in the model race, the calculation is that a major customer like Anthropic brings economies of scale that lower TPU manufacturing costs and lock in infrastructure revenue. The recent long-term supply agreement between Google and Broadcom running through 2031 also attests to the strength of this ecosystem.

Ultimately, leadership in the AI industry is shifting from a model performance race to a contest over infrastructure control. The more model companies secure users and pump out tokens, the more the value of the chip and network supply chains supporting them from behind is reinforced.

This shift carries major implications for Korea's semiconductor industry. Until now, the AI semiconductor market has grown around Nvidia graphics processing units (GPUs) coupled with high-bandwidth memory (HBM). But if the ASIC ecosystem centered on Google's TPUs spreads on the back of large customers like Anthropic, the very structure of memory demand could change.

A semiconductor industry official said, "Anthropic's growth may look on the surface like a victory for models, but in substance it is becoming a turning point that cements the custom chip ecosystem designed by Google and Broadcom as a counterweight to Nvidia," adding, "Domestic companies must break out of an Nvidia-centric supply chain and respond proactively to the tectonic shifts in the memory market that the expansion of the ASIC coalition will bring."

※ This article has been translated by AI.