SK hynix said on the 20th that it has begun mass production of the SOCAMM2 192GB, a next-generation AI server memory module.

SK hynix has begun mass production of the SOCAMM2 192GB, a next-generation memory module based on 10nm-class sixth-generation (1c) LPDDR5X low-power DRAM. / Courtesy of SK hynix

SOCAMM2 is a memory module based on low-power DRAM (LPDDR) and designed for server environments, including AI servers.

The product uses LPDDR5X built on a 10-nanometer-class sixth-generation (1c) process, delivering improved bandwidth and energy efficiency compared with conventional server RDIMMs.

The company said the product was designed to match Nvidia's next-generation platform "Vera Rubin."

It also expects the product to help ease memory bottlenecks that occur during AI model training and inference.

SK hynix plans to strengthen its position in the AI server memory market with the product.

※ This article has been translated by AI.