Global corporate investment in AI infrastructure is projected to surpass $1 trillion in 2027. Memory semiconductors and advanced packaging in particular are becoming critical to AI infrastructure, putting the strengths of Korea's semiconductor ecosystem in the spotlight.
Clark Cheong, senior director at SEMI (Semiconductor Equipment and Materials International), said this at a press briefing for Semicon Korea 2026 at COEX in Gangnam District, Seoul, on the 11th. Semicon Korea 2026, Korea's largest semiconductor exhibition and hosted by SEMI, opened the same day and runs through the 13th. Held under the theme "Transform Tomorrow," this year's event not only fills the entire COEX venue but has also expanded into nearby hotels, with 550 exhibiting companies and more than 2,400 booths.
AI infrastructure expenditure by the four major cloud service providers (CSPs)—Microsoft, Google, Amazon, and Meta—is increasing rapidly. Cheong said, "Next year, AI infrastructure investment will surpass $1 trillion," and noted, "From 2024 to 2028, the annual growth rate of expenditure will reach 38%."
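For a sense of scale, the 38% annual growth rate quoted above can be compounded over the four yearly steps from 2024 to 2028. The snippet below is a simple back-of-the-envelope calculation, not a figure from SEMI itself:

```python
# Compounding a 38% annual growth rate over the four year-on-year
# steps between 2024 and 2028 (illustrative arithmetic only).
annual_growth = 1.38
steps = 4  # 2024 -> 2025 -> 2026 -> 2027 -> 2028

multiple = annual_growth ** steps
print(round(multiple, 2))  # 3.63
```

In other words, if the quoted rate held, AI-infrastructure spending in 2028 would be roughly 3.6 times its 2024 level.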
Driven by AI demand, the DRAM market is expected to face a prolonged supply shortage. According to SEMI, the annual growth rate of DRAM production capacity is projected at 4.8% from last year through 2030. Cheong explained, "Memory suppliers are making new investments conservatively, and a significant portion of capacity increases will be absorbed by High Bandwidth Memory (HBM)," adding, "With investment concentrated in advanced areas, supply of legacy and specialty DRAM will be more limited."
As AI demand increases faster than expected, the outlook for DRAM production capacity over the next three years could be revised upward. The scale of fab investment by major domestic corporations such as Samsung Electronics and SK hynix is expected to expand. Cheong said, "Korea's fab investment will expand significantly to about $40 billion annually from 2026 to 2028," adding, "More than 80% will be related to DRAM and NAND, with some advanced logic investment to be carried out in the United States."
As the supply of AI semiconductors increases, demand is also expected to surge for HBM, the memory semiconductor essential to these chips. Lee Se-chul, managing director at Citigroup, said, "Semiconductor demand has shifted from PCs to mobile, data centers, and AI," adding, "With the spread of AI, memory is becoming central, so HBM demand is also soaring."
As AI infrastructure investment continues to expand, some expect that efforts to resolve the memory bottleneck arising during AI computation will spread beyond HBM to DRAM and NAND flash in succession. In the current architecture, the HBM mounted on GPUs supplies the memory that computation depends on. However, as inference workloads expand, the capacity available for that original purpose has become insufficient, creating a memory bottleneck.
Nvidia recently announced that its next-generation "Vera Rubin" architecture will introduce a platform called "Inference Context Memory Storage," a dedicated storage space for the KV cache that lets AI remember the context of a user's conversation. In his presentation, Lee said, "Recently, demand for the 'key-value (KV) cache' needed to run AI has been increasing, and semiconductor architectures are changing," adding, "As demand for memory semiconductors diversifies, not only HBM but also general-purpose DRAM and NAND will grow."
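The capacity pressure described above comes from the fact that a KV cache grows with every token of conversation context. The toy sketch below illustrates that linear growth; all names are hypothetical, and production inference engines manage the cache in GPU/HBM memory far more elaborately:

```python
# Toy illustration of a key-value (KV) cache in autoregressive inference.
# Class and method names are invented for this sketch.

class ToyKVCache:
    def __init__(self):
        self.keys = []    # one key per processed token
        self.values = []  # one value per processed token

    def append(self, k, v):
        # Storing each token's key/value means earlier tokens never
        # need to be recomputed when generating later tokens:
        # compute is saved at the cost of memory.
        self.keys.append(k)
        self.values.append(v)

    def entries(self):
        # Cache size grows linearly with context length -- this is the
        # HBM capacity pressure the article describes.
        return len(self.keys)

cache = ToyKVCache()
for step in range(4):              # pretend we generate 4 tokens
    cache.append(k=step, v=step * 10)

print(cache.entries())             # 4: one K/V pair per context token
```

Offloading this ever-growing cache to a cheaper, larger tier, as in Nvidia's announced storage platform, is one way to free HBM for computation, which is why demand is expected to spill over into general-purpose DRAM and NAND.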