Samsung Electronics' fifth-generation high-bandwidth memory (HBM3E) 12-high stack. / Courtesy of Samsung Electronics

As Nvidia is expected to begin full-scale exports to China of its H200 artificial intelligence (AI) chip, which is equipped with fifth-generation high-bandwidth memory (HBM3E), memory semiconductor makers such as Samsung Electronics and SK hynix are said to have raised next year's HBM3E supply prices. Orders have surged from companies that design their own AI accelerators, including Google and Amazon as well as Nvidia. The increase is also understood to reflect memory makers concentrating their capacity expansion on sixth-generation HBM (HBM4), for which demand is expected to jump next year, leaving them little room to respond to additional HBM3E demand.

According to industry sources on the 24th, memory semiconductor makers including Samsung Electronics and SK hynix are known to have raised HBM3E supply prices by nearly 20%. Analysts called this unusual, given that prices typically fall ahead of the launch of a next-generation HBM. The prevailing view is that the increase was driven by upward revisions to next year's HBM3E order volumes, not only by Nvidia, which has been the largest HBM3E customer, but also by Google and Amazon, which design their own AI accelerators.

An industry official said, "HBM3E established itself as the main product in the HBM market this year, and prices were expected to ease somewhat next year once the HBM4 market opens. However, as global big tech companies other than Nvidia release AI accelerators equipped with HBM3E next year, HBM3E demand continues to grow. Memory semiconductor makers cannot give up expanding HBM4 production capacity, so supply is failing to keep up with demand, resulting in a price premium of around 20%."

With exports of Nvidia's H200 AI chip to China now permitted, HBM3E demand appears to have expanded more than expected. Each H200 carries six HBM3E stacks. On the 22nd (local time), Reuters reported, "Nvidia plans to handle initial orders with existing inventory, and total shipments are expected to be 5,000 to 10,000 chip modules (about 40,000 to 80,000 H200 units)," adding, "Nvidia has informed Chinese customers of plans to expand new production capacity for the chip and is expected to begin taking related new orders from the second quarter of next year."

Global big tech companies such as Google and Amazon also need HBM3E, because Google's TPU (tensor processing unit) and Amazon's Trainium chips equipped with HBM3E begin shipping next year. Both products increase HBM capacity by roughly 20% to 30% over the previous generation. Google's seventh-generation TPU is known to carry eight HBM3E stacks per device, and Amazon's Trainium3 four.
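The per-device stack counts above imply a simple multiplication for HBM3E demand. The sketch below is a rough back-of-envelope calculation using only figures from this article: the stacks-per-accelerator counts and the 40,000 to 80,000 H200 unit range attributed to Reuters. Shipment volumes for the Google and Amazon parts are not given here, so only the H200 range is evaluated.

```python
# Back-of-envelope: HBM3E stacks implied by the shipment figures above.
# Stacks-per-accelerator counts are taken from the article; the H200
# shipment range (40,000-80,000 units) is the figure cited from Reuters.
HBM3E_PER_ACCELERATOR = {
    "Nvidia H200": 6,
    "Google TPU v7": 8,
    "Amazon Trainium3": 4,
}

def implied_stacks(accelerator: str, units: int) -> int:
    """HBM3E stacks needed to build `units` of the given accelerator."""
    return HBM3E_PER_ACCELERATOR[accelerator] * units

low = implied_stacks("Nvidia H200", 40_000)   # 240,000 stacks
high = implied_stacks("Nvidia H200", 80_000)  # 480,000 stacks
print(f"H200 exports alone imply {low:,}-{high:,} HBM3E stacks")
```

Even the low end of that range represents incremental demand on top of existing Nvidia, Google, and Amazon orders, which is consistent with the supply squeeze the article describes.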

With Samsung Electronics and SK hynix expected to focus production on next-generation HBM4, HBM3E demand is outstripping supply. Kim Dong-Won, an analyst at KB Securities, said, "Next year, HBM4 is forecast to account for 55% of HBM market revenue and HBM3E for 45%, and from the third quarter of next year HBM4 is expected to rapidly absorb HBM3E demand."

With the surge in HBM demand coinciding with continued price increases for mainstay products such as DRAM, expectations for next year's earnings at Samsung Electronics and SK hynix are also rising. According to financial data firm FnGuide, the consensus for Samsung Electronics' annual operating profit next year has been raised to 85.4387 trillion won from 76.6544 trillion won a month ago, and SK hynix's to 76.1434 trillion won from 71.4037 trillion won over the same period.
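For context, the size of those consensus revisions works out as follows. This is a minimal sketch using only the FnGuide figures quoted above (in trillions of won).

```python
# Back-of-envelope: percentage change in the FnGuide consensus operating
# profit forecasts quoted above, in trillions of won.
def pct_change(old: float, new: float) -> float:
    """Percentage change from `old` to `new`."""
    return (new - old) / old * 100

samsung = pct_change(76.6544, 85.4387)  # ≈ 11.5% upward revision
hynix = pct_change(71.4037, 76.1434)    # ≈ 6.6% upward revision
print(f"Samsung Electronics: +{samsung:.1f}%, SK hynix: +{hynix:.1f}%")
```

In other words, both consensus estimates were revised up by mid-to-high single digits or more within a single month.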

※ This article has been translated by AI.