MLA100 (NPU PCIe Card) equipped with Mobilint's AI semiconductor ARIES./Courtesy of Mobilint

Amid the accelerating race for artificial intelligence (AI) supremacy, the use of AI is expanding across fields such as automobiles, robotics, drones, and closed-circuit television (CCTV). Because it is difficult to fit graphics processing units (GPUs) into such devices, hardware and solutions built on neural processing units (NPUs) are essential there. With the market showing signs of full-scale expansion, domestic and foreign semiconductor 'sleeping giants' are expected to begin growing through initial public offerings (IPOs).

According to industry sources on the 1st, a number of edge AI semiconductor and solution companies are preparing to go public starting this year. MiTAC Digital Technology (MDT), a subsidiary of Taiwan's MiTAC Holdings, said at a recent media conference, "We have spent the past 20 years honing our capabilities," adding, "We are preparing an IPO to strengthen our edge AI business." Leading domestic NPU startups, including DeepX and Mobilint, along with on-device generative AI company Nota, are also said to be preparing listings.

On-device AI refers to performing the computations an AI model needs on the device itself, without an internet connection. The concept emerged to address the shortcomings of the current cloud-based approach, which depends on data centers and constant connectivity.

On-device AI offers advantages in security and privacy since data is processed on the device itself without being sent to the cloud. Global big tech corporations, including Samsung Electronics, Apple, and Microsoft, are competing to develop on-device AI technology and launch products. According to market research firm MarketsandMarkets, this market is expected to grow at an average annual rate of 37.7%, reaching approximately 240 trillion won by 2030.

While large fabless semiconductor designers such as NVIDIA, AMD, and Broadcom compete fiercely over AI accelerators for servers, the on-device AI NPU market has no dominant players yet, leaving it an untapped field. In South Korea in particular, attention is focused on when DeepX, touted in the AI semiconductor market as the 'next Qualcomm,' will list. The success of its first mass-produced chip, the DX-M1, due for release this year, is expected to be the turning point for the IPO timing. Over the past year, DeepX has run an early customer support program to verify prototypes requested by more than 300 companies worldwide, and sales are expected to begin in earnest in the second half of this year.

Industry insiders say the supply record of the DX-M1 and customer feedback will determine when DeepX goes public. One industry source noted, "Unlike other NPU companies that prefer mature processes, DeepX boldly chose the 5-nanometer process at an early stage and has achieved a high yield of 90%, which is the core of its competitiveness." This suggests the company has secured a favorable position in low power consumption, high performance, and price competitiveness.

Another promising NPU company, Mobilint, has begun mass production of its self-developed NPU 'ARIES' and aims to start deliveries in the second half of this year. It supplied samples of its second product, 'REGULUS,' to customers last year and plans to accelerate its mass production within the year. Performance remains the key question, however: with no meaningful revenue yet, Mobilint is expected to pursue its IPO only after securing up to 30 billion won in sales.

Park Jong-won of the LG Economic Research Institute said, "The trend of developing lightweight models to deliver the AI functions users need anytime, anywhere will likely continue for some time. As big tech companies build larger and smarter AI models, the performance of on-device AI models is expected to keep rising, enabling them to handle more functions."