SK hynix is preparing for mass production of HBM4 (6th generation HBM), its next-generation high bandwidth memory product, but may scale back the facility investment from initial estimates. Following the emergence of China's DeepSeek, expectations have grown that the market for artificial intelligence (AI) infrastructure built on mid-to-low-priced hardware will expand, which could keep demand for the more versatile HBM3E (5th generation HBM) alive longer than originally anticipated.
According to industry sources on the 28th, SK hynix will begin mass production of HBM4 in the second half of this year and plans to expand supply in step with NVIDIA's next-generation graphics processing unit (GPU), "Rubin." However, forecasts suggest that HBM3E, which is more versatile than the ultra-high-performance HBM4, will account for the larger share of revenue, prompting a conservative approach to expanding HBM4 facility investment.
Above all, uncertainty over the outlook for its largest customer, NVIDIA, is the greatest source of concern. NVIDIA has been pushing GPU specifications to the limit — following the Blackwell series it began shipping last year with Rubin this year and Feynman next year — while encouraging large-scale AI infrastructure investment by global IT corporations.
However, since the emergence of DeepSeek, major American big tech corporations have increasingly come to view AI infrastructure investments based on NVIDIA GPUs as excessive in terms of total cost of ownership (TCO). This shift is underscored by Broadcom, a strong player in the application-specific integrated circuit (ASIC) sector, which has recently won successive contracts from companies such as Google and Meta on the strength of cheaper, workload-optimized AI semiconductor designs.
Kwak Noh-jung, president of SK hynix, said the previous day at the company's 2025 annual shareholders' meeting, "It is true that there were concerns that demand for high-spec or high-capacity AI memory products would weaken due to DeepSeek's optimization of AI computational efficiency," adding, "If a diverse AI ecosystem optimized for demand takes hold, it will help increase demand in the medium to long term." He further noted, "On HBM4 facility investments, we need to watch demand and market conditions and discuss with customers as mass production proceeds in the second half of the year."
As big tech companies grow more conservative about adopting NVIDIA's latest products, forecasts that demand for the existing, more versatile HBM3E will persist longer have also gained traction. In this context, SK hynix unveiled the HBM3E 16-layer product, which pushes HBM3E performance to its limit, at CES 2025 in Las Vegas earlier this year. It is currently the top-tier product in the HBM3E lineup, a market dominated by 8-layer and 12-layer HBM3E products.
Sources familiar with SK hynix said, "Internally, there have been projections that demand for HBM3E could last longer than initially expected, and the product positioned to maintain an overwhelming lead in the three-way HBM3E competition is precisely the HBM3E 16-layer product," adding, "For AI companies and server firms seeking versatile HBM, the HBM3E 16-layer product will become a 'killer product.'"
Meanwhile, SK hynix was the first in the industry to supply HBM4 12-layer samples to customers and has entered the certification process, with plans to complete mass production preparations by the second half of this year. The company aims to boost performance by applying the logic front-end process of Taiwan's TSMC to the base die (the component mounted at the bottom) of HBM4.