The Amazon Web Services data center in Ashburn, Virginia, United States. / Courtesy of Reuters-Yonhap

Concerns are growing that prices for high-bandwidth memory (HBM) and DRAM, core components of artificial intelligence (AI) infrastructure, have soared as much as three- to fourfold, widening competitiveness gaps among companies. Big Tech firms such as OpenAI, Google, and Amazon hold an advantage in the supply chain through large-scale advance purchases, while AI service companies pushed down the priority list are forced to shoulder the full burden of surging chip prices. Some in the industry even interpret this as a sign that "memory is the new barrier to entry in the AI era."

According to major foreign outlets including Fortune and The Verge on the 11th, large-scale supply contracts between U.S. Big Tech firms such as OpenAI and memory makers Samsung Electronics and SK hynix are being cited as a primary cause of the current severe shortage of commodity DRAM. In fact, in Oct., OpenAI signed a letter of intent (LOI) and announced that it would join Stargate, a global AI infrastructure project. Through this, it is reportedly securing up to 900,000 DRAM wafers per month. That exceeds half of the global monthly DRAM output of 1.5 million wafers.

In addition, Google and Meta are procuring memory chips from Samsung Electronics and SK hynix in large volumes through Broadcom. Samsung Electronics in particular has sharply increased its deliveries to Broadcom starting with fifth-generation HBM (HBM3E), joining SK hynix as one of Broadcom's two main suppliers. The two companies are estimated to allocate about 40% to as much as half of their total DRAM production to HBM.

Memory chips are as critical as graphics processing units (GPUs) in determining AI training and inference costs. Over the past one to two years, HBM prices have jumped as much as three- to fourfold, and commodity DRAM prices have nearly doubled, causing cost burdens to snowball. As memory suppliers shift production lines to focus on HBM and global AI companies lock in large volumes, the supply that ordinary buyers can purchase has shrunk significantly.

IT companies with weaker purchasing power than Big Tech are finding it difficult to obtain not only HBM but also commodity DRAM without paying a steep premium. This trend is a direct blow to domestic platform companies. Naver and Kakao operate their own data centers, but they must still continuously upgrade core hardware such as externally sourced GPUs and HBM for large-scale model training.

With memory chip purchase costs soaring unchecked, AI infrastructure investment by domestic platform companies is highly likely to stall. A source familiar with Naver said, "Aside from Big Tech, no company can keep absorbing the currently skyrocketing chip purchase costs while keeping large language model (LLM) training infrastructure up to date," adding, "If the cost of training in-house models increases exponentially, domestic companies are likely to fall back on a defensive strategy of handling only optimization themselves while relying on Big Tech for core models."

The burden is even more evident in search, advertising, recommendation, and commerce, fields where AI quality determines service competitiveness. While Big Tech in the United States and Europe continuously upgrades ultra-large models on the latest HBM, Korean companies may struggle to keep pace, raising concerns that service quality gaps could widen over the mid to long term.

The startup ecosystem is also likely to be shaken. As small and midsize companies abandon in-house model development and grow more reliant on overseas providers such as OpenAI and Google, Korea's technological self-reliance declines and its platform companies become structurally subordinate to global AI suppliers. In fact, many startups announced business restructurings between last year and this year, citing the burden of AI training costs. An industry official said, "The surge in memory chip prices is not just a cost burden; it is a new 'capital barrier' of the AI era," explaining, "It is a textbook case of supply chain polarization in which differences in procurement capability translate into technology gaps."

※ This article has been translated by AI.