Deputy Prime Minister and Minister of Science and ICT Bae Kyung-hoon, Senior Secretary for AI Future Planning Ha Jung-woo of the presidential office, and other attendees pose for a commemorative photo at the announcement of the Independent AI Foundation Project at COEX Auditorium in Gangnam-gu, Seoul, on December 30 last year. /Courtesy of News1

AI startup Motif Technologies will take part in the wild-card round of the "national AI model project." The project is a core government initiative that concentrates support, such as graphics processing units (GPUs) and data, on building Korea's own independent AI model, with the goal of ranking among the world's top three in AI.

On the 20th, Motif Technologies said, "We plan to actively participate in the government's additional call for the national AI project," adding, "We are also in talks with additional companies beyond the consortium formed in July last year."

On the 16th, at a briefing on the first-stage evaluation results of the independent AI foundation model project, the Ministry of Science and ICT, which had initially planned to narrow five elite teams down to four, changed course and eliminated Naver Cloud and NC AI in the first round. The NC AI elite team fell short in benchmark, expert, and user evaluations, while Naver Cloud failed to meet the originality requirement and dropped out of the competition. With two teams eliminated at once, the government said it would give another chance to all companies, including the eliminated elite teams and those not previously selected as elite teams. The teams still in the competition are the Upstage, SK Telecom, and LG AI Research Institute consortiums.

Motif Technologies previously failed to make the cut when the government narrowed the national AI project competition teams to five. Motif Technologies said, "We are the only domestic startup with experience developing both high-performance LLMs (large language models) and LMMs (large multimodal models) as foundation models," adding, "We will work to prove the originality of Korea's technology."

Motif Technologies said the large language model (LLM) "Motif 12.7B," unveiled in November last year, was developed entirely with domestic technology, from model design to data training. In particular, rather than using the transformer architecture as is, the company said it independently developed and applied a "grouped differential attention (GDA)" technique. It said this goes beyond simply training a model from scratch: it redesigns the attention function, the core of the model's intelligence, along with the model architecture itself, and has been highly rated for originality.
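The article does not detail how GDA works. As a rough illustration of the general idea behind differential attention, on which the name suggests GDA builds (computing two attention maps and subtracting one from the other to cancel common-mode attention noise), here is a minimal NumPy sketch. All function and variable names are hypothetical, and the "grouped" aspect of Motif's technique is not modeled here.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def differential_attention(Q1, K1, Q2, K2, V, lam=0.5):
    """Toy single-head differential attention (illustrative only).

    Output = (softmax(Q1 K1^T / sqrt(d)) - lam * softmax(Q2 K2^T / sqrt(d))) @ V
    Subtracting the second attention map is intended to suppress
    attention assigned to irrelevant context.
    """
    d = Q1.shape[-1]
    a1 = softmax(Q1 @ K1.T / np.sqrt(d))  # first attention map
    a2 = softmax(Q2 @ K2.T / np.sqrt(d))  # second ("noise") attention map
    return (a1 - lam * a2) @ V

# Random toy inputs: 4 tokens, hidden size 8.
rng = np.random.default_rng(0)
n, d = 4, 8
out = differential_attention(rng.normal(size=(n, d)), rng.normal(size=(n, d)),
                             rng.normal(size=(n, d)), rng.normal(size=(n, d)),
                             rng.normal(size=(n, d)))
print(out.shape)  # (4, 8)
```

In a full transformer this replaces the standard attention inside each head; a grouped variant would additionally share projections or the subtraction weight across groups of heads, but those details are not given in the article.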

Despite its relatively small size of 12.7 billion parameters, the model outscored far larger models such as the 675-billion-parameter Mistral Large 3 on the Artificial Analysis Intelligence Index (AAII), a comprehensive assessment of global AI model performance.
