SK hynix is understood to be supplying fifth-generation High Bandwidth Memory (HBM), known as HBM3E, for Microsoft's (MS) latest artificial intelligence (AI) chip, the Maia 200. As the market for custom application-specific integrated circuits (ASICs) expands among global big tech companies such as Amazon, alongside Nvidia and AMD, demand for HBM is diversifying, and competition with Samsung Electronics over this year's HBM market is expected to intensify.
According to industry sources on the 27th, SK hynix is said to be the exclusive supplier of HBM3E for the Maia 200 AI accelerator that Microsoft unveiled on the 26th (local time). Built on TSMC's 3-nanometer (nm; 1 nm = one-billionth of a meter) process, the Maia 200 carries six of SK hynix's 12-high HBM3E stacks, for a total of 216GB of HBM3E.
Beyond MS, Google's seventh-generation tensor processing unit (TPU), "Ironwood," and Amazon's third-generation "Trainium" are also diversifying the pool of HBM customers, as big tech companies build their own AI chips to reduce reliance on Nvidia. Both Samsung Electronics and SK hynix supply HBM3E for Google's TPU, a chip Google developed with U.S. semiconductor design company Broadcom to run AI workloads; each TPU carries six to eight HBM stacks.
Following HBM3E, competition is expected to intensify further over the next-generation sixth-generation product, HBM4. Samsung Electronics is in the final stages of quality testing for HBM4 with Nvidia and is said to be set to begin official deliveries as early as next month. SK hynix is also reported to be in final coordination with Nvidia over supply.