
SK hynix announced on the 19th that it will operate a booth under the theme 'The Tomorrow of AI Driven by Memory' at 'GTC 2025,' the global AI conference hosted by NVIDIA in San Jose, California, from the 17th to the 21st (local time).
SK hynix explained that it will showcase a range of memory products set to lead the AI era, including high-bandwidth memory (HBM) as well as memory solutions for AI data centers, on-device applications, and the automotive sector.
SK hynix said, 'In addition to the HBM3E 12-layer, we will also showcase SOCAMM, which is drawing attention as a new memory standard for AI servers, demonstrating our leading AI memory technology.' SOCAMM is a low-power DRAM-based memory module specialized for AI servers.
Key executives, including President Kwak Noh-jung, President Kim Joo-seon, who heads AI Infrastructure, and Vice President Lee Sang-rok, are expected to attend the event to strengthen cooperation.
SK hynix, which is currently supplying the world's first 12-layer 5th-generation HBM (HBM3E) product to customers, plans to complete preparations for mass production of the 12-layer HBM4 product in the second half of this year and begin supply in line with customer demand. A model of the HBM4 12-layer product currently under development will also be displayed at the exhibition.
Kim Joo-seon, President of AI Infrastructure at SK hynix, said, 'It is meaningful to present products leading the AI era at this GTC,' adding, 'We will accelerate the future through differentiated AI memory competitiveness.'
Meanwhile, on the same day, SK hynix announced that it had become the first in the world to provide samples of the HBM4 12-layer, an ultra-high-performance DRAM for artificial intelligence (AI), to major customers. The customers who received the samples are believed to be U.S. big tech companies such as NVIDIA and Broadcom.
SK hynix stated, 'Based on the technological competitiveness and production experience that have led the HBM market, we have shipped HBM4 12-layer samples ahead of the original schedule and will begin the certification process with customers,' adding, 'We will also complete preparations for mass production in the second half of this year to solidify our position in the next-generation AI memory market.'
SK hynix explained that the new HBM4 12-layer product offers the world's best speed and capacity, achieving for the first time a bandwidth capable of processing more than 2TB (terabytes) of data per second.
This is equivalent to processing the data of more than 400 full-HD (FHD) movies of 5GB (gigabytes) each in a single second, a speed increase of more than 60% over the previous generation (HBM3E). The product also applies the advanced MR-MUF process, already proven in HBM3E, to achieve a maximum capacity of 36GB in a 12-layer stack.
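As a rough sanity check on the company's comparison (assuming decimal units, i.e. 1 TB = 1,000 GB, and taking 5GB as the size of one FHD movie, as the comparison itself does):

\[
\frac{2\,\mathrm{TB/s}}{5\,\mathrm{GB\ per\ movie}} = \frac{2{,}000\,\mathrm{GB/s}}{5\,\mathrm{GB}} = 400\ \text{movies per second}
\]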