SK hynix held an "HBF Spec Standardization Consortium Kickoff" event with Sandisk on the 25th (local time) at Sandisk's headquarters in Milpitas, California, announcing a global standardization strategy for HBF (High Bandwidth Flash), a next-generation memory solution aimed at the era of artificial intelligence (AI) inference.
SK hynix said, "Together with Sandisk, we will establish HBF as an industry standard to build a foundation for the entire AI ecosystem to grow together," adding, "We will form a dedicated workstream with Sandisk under OCP for key tasks and begin full-fledged standardization work." OCP (Open Compute Project) is the world's largest open data center technology consortium, and a workstream is a collaborative working group under OCP organized around a specific technical topic.
Recently, the AI industry's center of gravity has been shifting from the "training" stage of building large language models (LLMs) to the "inference" stage that provides actual services. With the existing memory architecture alone, it is difficult to simultaneously meet the demands for large-scale data processing and power efficiency required in the inference stage. HBF has emerged as an alternative that can solve these limitations.
HBF sits between HBM (High Bandwidth Memory), an ultra-high-speed memory, and SSDs, high-capacity storage devices, securing both the capacity expansion and the power efficiency that inference workloads require. Conventional HBM delivers top-tier bandwidth, while HBF complements it on the capacity side.
In particular, HBF is expected to enhance the scalability of AI systems while reducing total cost of ownership (TCO). The industry expects demand for composite memory solutions including HBF to expand in earnest around 2030.
In the AI inference market, system-level optimization that spans CPUs, GPUs, memory, and storage determines competitiveness more than the performance of a single chip. As a result, the ability to provide both HBM and HBF is becoming important. SK hynix and Sandisk plan to take the lead in accelerating the standardization and commercialization of HBF, leveraging their design and packaging technologies and high-volume manufacturing experience built in HBM and NAND.
Ahn Hyun, president and chief development officer (CDO) of SK hynix, said, "The core of AI infrastructure goes beyond performance competition in a single technology to optimizing the entire ecosystem," adding, "Through HBF standardization, we will establish a cooperative framework and create new value by presenting an optimized memory architecture for customers and partners in the AI era."