Park Kyung, SK hynix executive vice president (head of Biz Insight), gives a presentation on "The evolution of AI service infrastructure and the role of memory" on day two of SK AI Summit 2025 at COEX in Gangnam-gu, Seoul. /Courtesy of Jeong Doo-yong

Memory-first optimization is now bound to sit at the heart of architecture (design).

Executive Vice Presidents Park Kyung (Business Insight) and Joo Young-pyo (System Architecture), known as the "brains" of SK hynix, made this point in their day-two session presentations at SK AI Summit 2025, held at COEX in Gangnam-gu, Seoul, on the 4th. As artificial intelligence (AI) takes hold as a key growth engine for industry, they said, the memory market is also shifting to a "customer-tailored" model.

The Business Insight organization led by Park identifies core trends in the memory market, such as technology and competition, to support key decision-making. Because it studies the system changes driven by the evolution of AI services and maps out the memory market's growth trajectory, it is known as "the place that sets SK hynix's future strategy." The System Architecture organization led by Joo is the core development unit that executes that future strategy, where hands-on research and development (R&D) of next-generation, memory-centric system architecture technology takes place.

◇ Shift from simple supplier to "memory creator"

Park presented on the theme "The evolution of AI service infrastructure and the role of memory." He said, "We are seeing a shift from agent AI to physical AI (AI combined with physical systems such as robots)," adding, "We don't know how much the world will change; only the speed and scale are in question. The direction (of business reorganizing around AI) is irreversible."

The AI semiconductor market that will implement these changes is also growing rapidly. Park said, "By 2030, the global data center market is expected to exceed $980 billion (about 1,410.8 trillion won), with AI-oriented data centers accounting for about 84% of that at $823 billion (about 1,184.8 trillion won)." He added, "The memory semiconductors needed in 2030 are projected at about 41 million wafers, but supply capacity will be just 31 million wafers," and said, "With this structural change, the memory semiconductor market, which has been centered on price competition, will also change. The memory semiconductor industry will gain strong leverage, with a wider range of options to exercise and the conditions to lead on technology."
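As a rough plausibility check on the figures quoted above (a back-of-the-envelope sketch, not part of the presentation), the share and the supply gap can be recomputed directly; all inputs are taken from the quotes, and the KRW/USD rate is simply the one implied by the article's own won conversions.

```python
# Back-of-the-envelope check of the market figures quoted above.
# All inputs come from the article; the KRW/USD rate is the one implied by its conversions.

DATA_CENTER_2030_USD = 980e9      # total data center market, 2030 (USD)
AI_DATA_CENTER_2030_USD = 823e9   # AI-oriented data center share (USD)
KRW_PER_USD = 1_439.6             # rate implied by "about 1,410.8 trillion won"

ai_share = AI_DATA_CENTER_2030_USD / DATA_CENTER_2030_USD
print(f"AI data center share: {ai_share:.1%}")  # ~84%, matching the quote

print(f"Total market: {DATA_CENTER_2030_USD * KRW_PER_USD / 1e12:,.1f} trillion won")
print(f"AI segment:   {AI_DATA_CENTER_2030_USD * KRW_PER_USD / 1e12:,.1f} trillion won")

# Projected memory wafer demand vs. supply in 2030 (millions of wafers)
demand_mwafers, supply_mwafers = 41, 31
gap = demand_mwafers - supply_mwafers
print(f"Wafer shortfall: {gap}M wafers (~{gap / demand_mwafers:.0%} of projected demand)")
```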

Park also predicted that as AI services develop around "inference" and grow increasingly lightweight, the memory semiconductor market will shift to customer-tailored products. He said, "For AI training, the required memory is fairly predictable, but for inference, memory cannot be fixed, because the amount required surges as the number of connections grows and the depth of reasoning deepens." He added, "Unlike training, inference makes everything a target for optimization: GPU memory, host memory (main memory), and solid-state drives (SSDs). Memory performance, capacity, and configuration determine service efficiency."
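To illustrate why inference memory "cannot be fixed" the way training memory can, the sketch below estimates the key-value (KV) cache footprint as concurrent connections and reasoning (context) length grow. The model shape, user counts, and byte sizes are illustrative assumptions, not figures from the presentation.

```python
# Illustrative sketch: how inference-time KV-cache memory scales with
# concurrent connections and reasoning (context) depth.
# The model shape and workload numbers are assumptions for illustration,
# not figures from the SK hynix presentation.

def kv_cache_bytes(layers: int, kv_heads: int, head_dim: int,
                   context_tokens: int, bytes_per_value: int = 2) -> int:
    """KV cache for one request: two tensors (K and V) per layer."""
    return 2 * layers * kv_heads * head_dim * context_tokens * bytes_per_value

# Hypothetical 70B-class model (grouped-query attention, 8 KV heads), FP16 cache entries.
LAYERS, KV_HEADS, HEAD_DIM = 80, 8, 128

for users, context in [(1, 4_096), (64, 8_192), (512, 32_768)]:
    total = users * kv_cache_bytes(LAYERS, KV_HEADS, HEAD_DIM, context)
    print(f"{users:>4} users x {context:>6} tokens -> {total / 2**30:8.1f} GiB of KV cache")

# The footprint grows roughly linearly in both factors, quickly exceeding GPU
# memory and spilling into host memory or SSD-backed tiers, which is the
# "everything is an optimization target" situation described above.
```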

These changes are already appearing in practice. Rubin, NVIDIA's recently unveiled next-generation AI semiconductor platform slated for mass production in 2026, pairs the GPU with HBM4 (high-bandwidth memory), the central processing unit (CPU) with LPDDR5X (low-power DRAM), and the CPX (an AI inference-focused accelerator) with GDDR7 (graphics DRAM). Park called this "a pattern we haven't seen before," explaining, "The cookie-cutter memory structure of attaching DRAM and NAND to an x86 CPU is breaking down, and an 'era of combinations' is opening in which memory must be placed by function."
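As a rough way to picture this "era of combinations," the snippet below models function-specific memory placement as a simple configuration, one memory type per compute element. The pairings follow the article; the capacity and bandwidth numbers are placeholder assumptions, not NVIDIA or SK hynix specifications.

```python
# Sketch of function-specific memory placement in a Rubin-style node.
# The device/memory pairings follow the article; capacities and bandwidths
# are placeholder assumptions, not NVIDIA or SK hynix specifications.

from dataclasses import dataclass

@dataclass
class MemoryTier:
    device: str         # compute element the memory is attached to
    memory: str         # memory type paired with it
    capacity_gb: int    # illustrative capacity
    bandwidth_gbs: int  # illustrative bandwidth

node = [
    MemoryTier("GPU", "HBM4", capacity_gb=288, bandwidth_gbs=13_000),
    MemoryTier("CPU", "LPDDR5X", capacity_gb=1_024, bandwidth_gbs=500),
    MemoryTier("CPX (inference accelerator)", "GDDR7", capacity_gb=128, bandwidth_gbs=1_800),
]

for tier in node:
    print(f"{tier.device:<28} -> {tier.memory:<8} "
          f"{tier.capacity_gb:>5} GB @ {tier.bandwidth_gbs:>6} GB/s")

# Unlike the old "DRAM + NAND behind an x86 CPU" pattern, each function gets
# a memory type matched to its bandwidth, capacity, and power profile.
```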

Park forecast that with these changes, the memory market will "move beyond the era of competing on cost with a small number of mass-produced product types, and shift toward creating customer value together and providing solutions that resolve customer needs."

Joo Young-pyo, SK hynix executive vice president (head of System Architecture), gives a presentation on "The necessity and direction of collaboration with system companies from a memory company's perspective" on day two of SK AI Summit 2025 at COEX in Gangnam-gu, Seoul. /Courtesy of Jeong Doo-yong

Joo, who presented on "The necessity and direction of collaboration with system companies from a memory company's perspective," likewise said, "In a diversified AI service environment, a memory structure optimized to the requirements of each stage is needed." He explained, "An optimized memory structure is difficult to realize without joint planning and participation from the earliest stages of system design," adding, "Through collaboration, an integrated approach that considers structural design, interfaces, and even power and thermal characteristics together is becoming essential."

SK hynix is concentrating on developing the technologies needed to supply custom memory semiconductors to each customer in a timely manner in the era of AI inference. In his presentation the day before, SK hynix President Kwak Noh-jung likewise said, "Our new aim is to go beyond being a memory provider (supplier) and become a 'full-stack AI memory creator,'" adding that the company will expand customer-tailored product lines across DRAM and NAND, including ▲custom HBM ▲AI-D (DRAM) ▲AI-N (NAND).

※ This article has been translated by AI.