Following NVIDIA's unveiling of its personal supercomputer "Digits," major PC manufacturers such as Lenovo and HP have reportedly asked Samsung Electronics and SK hynix for technical collaboration to secure a new type of memory module for next-generation artificial intelligence (AI) PCs.
According to industry sources on the 24th, Samsung Electronics and SK hynix are internally testing various AI PC DRAM modules, including the Small Outline Compression Attached Memory Module (SOCAMM), tailored to customer requests. A DRAM module connects multiple DRAM chips to a device's computing system, and its memory bandwidth, power consumption, and performance vary with the module's structure.
The dual in-line memory module (DIMM) has long dominated the PC industry, but AI functionality now demands a new form factor that delivers lower latency, higher bandwidth, and better power efficiency. The most promising next-generation memory module at present is the SOCAMM, a technology proposed by NVIDIA that targets high-performance computing and the AI market.
The SOCAMM stacks DRAM vertically, allowing more memory to be integrated into a smaller design than existing products. Because the module is detachable, users can continue to upgrade a PC's performance by replacing the memory. In short, its 3D structure fits more memory into a PC, aids optimization, and makes upgrades easy from both consumer and manufacturer perspectives.
Park Jun-young, a researcher at Hyundai Motor Securities, noted that "the SOCAMM is a module that integrates the system on chip (SoC) and memory in a single package, which is gaining attention due to the increasing need for high bandwidth, low power consumption, and compact form factors in AI devices. Because the physical distance between the SoC and memory is short, it can improve communication efficiency between logic and memory, thus achieving high bandwidth and low latency characteristics."
Additionally, various other memory modules, such as Low Latency Wide I/O (LLW) DRAM and the Low Power Compression Attached Memory Module (LPCAMM), are being discussed for AI devices. Among them, LPCAMM, developed by Samsung Electronics, reduces the footprint by over 60% compared with So-DIMM, increasing design flexibility for PCs and laptops and freeing internal space for, among other things, larger batteries.
Like SOCAMM, LPCAMM is detachable, giving manufacturers flexibility while making replacement and upgrades convenient for users. Compared with the existing Small Outline Dual In-line Memory Module (So-DIMM), LPCAMM is expected to improve performance by up to 50% and power efficiency by up to 70%, opening up expanded applications in artificial intelligence (AI), high-performance computing (HPC), servers, and data centers.
An industry insider stated, "Although the SOCAMM is not yet widely adopted, it appears that initial work to facilitate its adoption has begun. As the AI industry develops rapidly and AI devices permeate everyday life, SOCAMM, LPCAMM, and LLW are expected to play significant roles as next-generation memory solutions."