Chey Tae-won, chairman of SK Group, delivers a keynote titled "AI Now & Next" at the SK AI Summit 2025, held at the COEX Auditorium in Gangnam District, Seoul, on the 3rd./Courtesy of News1

"In the future, every individual will have a personal intelligent artificial intelligence (AI) assistant. No single corporations can achieve this, and the partnership between OpenAI and SK is essential—and this is only the beginning." (Sam Altman, OpenAI chief executive officer (CEO))

"SK is establishing itself as a leader in applying Generative AI to real business. We optimized (SK Telecom's customer agent service) with Bedrock, AWS's AI model, and the key performance indicator (KPI) achievement rate rose 37%, while positive customer feedback increased 73%." (Andy Jassy, Amazon Web Services (AWS) CEO)

Chey Tae-won, SK Group chairman, said at the SK AI Summit 2025 held at COEX in Gangnam-gu, Seoul, on the 3rd, "OpenAI requested high-bandwidth memory (HBM) at a scale of 900,000 wafers per month for the world's largest AI infrastructure build-out project. That is more than double the production capacity of a single company," adding, "We continue to collaborate with AWS to develop the most efficient AI technologies." On the day, Chey delivered a keynote speech on the theme of "AI Now & Next," presenting SK Group's AI strategy and the current status of collaboration with global big tech.

Chey projected that AI demand will continue to grow sharply, pointing to ▲ the full-fledged onset of AI inference ▲ business-to-business (B2B) adoption of AI ▲ agentic AI ▲ sovereign AI. "Once AI begins full inference, it will think more deeply about the question asked and repeatedly verify its own answers, and in the process, demand for computing will inevitably rise," Chey said. "To improve work efficiency, corporate adoption of AI will expand, and agents that perform tasks on their own will also spread. Beyond the United States and China, every country in the world is trying to build its own AI and is rolling out government-led investment plans," he added.

Chey said SK will establish its foothold in the AI market by providing the most efficient AI solutions, using them to resolve the mismatch between demand and supply that has become a bottleneck for AI proliferation. SK said it will achieve this through its memory semiconductor capabilities, AI infrastructure, and AI utilization. "The supply of AI computing power will struggle to keep up with demand growth, leading to a mismatch," he said. "In the future, bottlenecks will exist not only in memory semiconductors but also in other components." Chey added, "We are receiving requests from many corporations for memory semiconductor supply," noting, "We are considering how to respond to this problem."

SK is expanding production capacity to meet the exponentially growing demand for memory semiconductors. "We have completed the Cheongju plant, an HBM production base, and plan to begin mass production next year," Chey said. "In 2027, a fab (plant) in the Yongin cluster is scheduled to come online; the site can accommodate four large fabs. One fab is large enough to contain six of the recently completed Cheongju M15x fabs," he explained.

He went on to say, "SK aims to provide the most efficient AI infrastructure solutions by building its own data centers and offering everything from semiconductors to power and energy solutions," adding, "Projects such as SK AI Data Center Ulsan in collaboration with AWS and the Southwest AI Data Center released with OpenAI last month are underway."

Chey also expressed confidence in SK hynix's technology. He said, "SK hynix found a breakthrough by developing ultra-high-capacity memory chips or by adopting NAND concepts," adding that "SK hynix's technological prowess has been sufficiently proven in the industry." He continued, "Even Nvidia CEO Jensen Huang no longer talks to us about development speed," saying, "This means we are fully prepared."
