As the spread of artificial intelligence (AI) accelerates, simply boosting graphics processing unit (GPU) performance in data centers is no longer enough to stay competitive on speed. Running large-scale AI models requires data to be fetched instantly, and if data delivery slows, the entire system becomes the bottleneck.

SSD (solid-state drive) controller company FADU presented its next-generation SSD and power semiconductor strategies for responding to these changes at a technology briefing held at its headquarters in Samseong-dong, Seoul, on the 4th, explaining, "In the AI era, storage and power efficiency determine system performance."

SSD on display at the FADU booth at OCP Korea Tech Day./Courtesy of News1

FADU has delivered results this year in both performance improvements and global customer diversification. In the first quarter, its revenue from major customers jumped more than tenfold, from 3.2 billion won in the previous quarter to 33.4 billion won. The industry believes the key customer is SanDisk. FADU also recently won a large order worth more than 60 billion won and is rapidly expanding its presence in the global NAND flash ecosystem, including the United States, Japan and Taiwan.

With AI data center investment and customer diversification picking up speed, FADU's strategic direction is also clear. As it becomes harder to boost semiconductor performance through process scaling alone, the era has arrived in which application-specific hardware determines system performance. In particular, with data exploding to the point that the data generated over the past two years accounts for 90% of all data humanity has ever produced, companies are adopting total cost of ownership (TCO), which covers not only equipment price but also power, cooling and maintenance, as the core competitive metric. FADU explained, "A structure that uses less power while supplying data faster is needed."
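
As a rough illustration of how power feeds into that metric, the sketch below computes a simple per-drive TCO from purchase price, electricity used over the service life, and a cooling overhead factor (PUE). The formula's scope and every figure in it are illustrative assumptions, not FADU's numbers.

```python
# Minimal TCO sketch. Illustrative assumptions only: prices, wattage, PUE,
# electricity rate and lifetime are made-up figures, and maintenance is
# folded into a flat annual rate.
def ssd_tco(price_usd, avg_power_w, years=5, usd_per_kwh=0.10,
            pue=1.4, annual_maintenance_usd=10.0):
    """Per-drive total cost of ownership over the service life."""
    hours = years * 365 * 24
    energy_kwh = avg_power_w / 1000 * hours * pue   # PUE adds cooling overhead
    return price_usd + energy_kwh * usd_per_kwh + annual_maintenance_usd * years

# A drive that draws a few watts less can offset a higher purchase price.
print(ssd_tco(price_usd=400, avg_power_w=24))   # ~597 USD over 5 years
print(ssd_tco(price_usd=430, avg_power_w=18))   # ~590 USD despite costing more upfront
```

Scaled across tens of thousands of drives, that per-unit difference is why operators treat power as part of the purchase decision rather than an afterthought.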

As AI models become more complex, the role of SSDs is expanding. GPUs and high-bandwidth memory (HBM) are fast, but SSDs and networks cannot keep up, creating bottlenecks. In particular, as retrieval-augmented generation (RAG) and vector database-based services increase, how fast and how consistently an SSD reads random data (QoS/IOPS) determines AI service quality. FADU said, "High-performance SSDs that supply data instantly next to GPUs will become a key pillar of AI infrastructure going forward."
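
For a sense of what random-read QoS means in practice, here is a minimal sketch that times random 4 KiB reads against a local file and reports tail latency. The file path, sample count and single-threaded access pattern are assumptions for illustration; a real benchmark tool such as fio would bypass the page cache and drive much deeper queue depths.

```python
# Rough random-read latency probe (POSIX-only; assumes testfile.bin is a large
# pre-created file on the SSD under test). Results include page-cache effects,
# so treat them as a sketch, not a proper benchmark.
import os, random, statistics, time

PATH = "testfile.bin"   # assumed path, for illustration
BLOCK = 4096            # 4 KiB, a typical random-read block size
SAMPLES = 10_000

size = os.path.getsize(PATH)
fd = os.open(PATH, os.O_RDONLY)
latencies = []
for _ in range(SAMPLES):
    # pick a block-aligned offset at random
    offset = random.randrange(0, size - BLOCK) // BLOCK * BLOCK
    t0 = time.perf_counter()
    os.pread(fd, BLOCK, offset)
    latencies.append(time.perf_counter() - t0)
os.close(fd)

latencies.sort()
p99 = latencies[int(0.99 * len(latencies))]
print(f"mean {statistics.mean(latencies) * 1e6:.1f} us, p99 {p99 * 1e6:.1f} us, "
      f"~{SAMPLES / sum(latencies):,.0f} single-threaded IOPS")
```

The p99 figure is the kind of tail-latency metric that matters for RAG-style services, where one slow lookup can stall an entire response.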

FADU is improving power efficiency by moving away from software-centric SSD architectures and implementing frequently used core operations in hardware. It designs its core IP in-house so the controller can support NAND from a variety of vendors, and it has built an architecture that maximizes performance within a limited power budget (23–24 W).

Plans for next-generation SSDs were also unveiled. The current Gen6 SSD delivers more than double the performance of the previous generation while reducing power consumption. With the next-generation Gen7 SSD, FADU aims to achieve far faster read and write speeds (100 million IOPS). To that end, it plans to raise the SSD's internal processing speed and apply technologies that automatically correct errors. It is also developing a method in which the SSD accesses the required data directly, without going through a GPU or CPU (D2D). This, FADU explained, would cut out unnecessary data paths and speed up AI workloads.
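
As a back-of-envelope check on what an IOPS target of that order implies, the snippet below converts it into the sustained bandwidth required at an assumed transfer size; the 512 B and 4 KiB block sizes are common benchmark conventions, not figures from FADU.

```python
# Back-of-envelope only: bandwidth implied by an IOPS target at an assumed
# per-I/O transfer size (block sizes are assumptions, not FADU figures).
def required_bandwidth_gb_s(iops: int, block_bytes: int) -> float:
    return iops * block_bytes / 1e9

for block_bytes in (512, 4096):
    gb_s = required_bandwidth_gb_s(100_000_000, block_bytes)
    print(f"100M IOPS at {block_bytes} B per I/O -> {gb_s:.1f} GB/s sustained")
```

Sustaining data movement at that scale is part of what motivates shortening the path between storage and accelerators, which is the point of the D2D approach described above.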

The power management integrated circuit (PMIC) business is also emerging as FADU's second growth pillar. In data centers, even a slight improvement in power efficiency yields large cost savings, driving rapid growth in PMIC demand.

FADU has already developed PMICs, semiconductors that manage power so that SSDs operate stably, and is securing certifications from global customers. It said it is establishing its position in the market with a "companion PMIC" approach, designing and supplying SSD controllers and power semiconductors as a set.

Chief Technology Officer Nam I-hyeon said, "Korea is the world leader in memory and foundry, but system layers such as SSDs, neural processing units (NPUs) and server software are still empty," adding, "FADU aims to fill this gap and become a full-stack system semiconductor company that directly designs the core components of AI infrastructure."

He added, "In the AI era, power and data processing efficiency determine competitiveness," and said, "We will build an ecosystem with partners that lowers TCO."

※ This article has been translated by AI.