NC AI laid out domain expertise, security and control, and cost efficiency as the conditions for bringing generative artificial intelligence (AI) to industrial sites, and unveiled VAETKI, an industry-specialized foundation model built to meet those conditions, along with a framework for field deployment. A foundation model is a base AI that can serve as a common backbone across a wide range of tasks.
Lee Hyeon-su, chief executive officer (CEO) of NC AI, said on the 30th at the first briefing for the independent AI foundation model project, held at the COEX Auditorium in Samseong-dong, Seoul, "A bigger model is not always the answer. We need to secure real-time response speed with a model optimized for specific businesses."
He then emphasized that for industry-specialized AI to become a technology actually used in the field, security, operations, and cost must be designed for alongside technical performance. The first condition Lee presented was domain expertise and flexibility: because each industry, such as manufacturing, logistics, and defense, has its own terminology, regulations, and workflows, he said industry data must be integrated in real time so the model understands on-site context.
The second is security and control. Lee said, "In an on-premises environment, we must prevent confidential information from leaking outside the company." On-premises refers to running AI on a customer's own internal servers rather than in an external cloud. The third is cost efficiency; Lee emphasized that the core is delivering the quality and speed industry demands at a realistic cost.
Building on this, NC AI said it achieved more than 100% of its phase-one targets. Specifically, it secured high-quality, Korea-specific industrial data and completed development of a 100B (100 billion)-parameter large language model (LLM).
The model strategy focused on efficiency and scalability as well as raw performance. NC AI said it built a lineup aimed at configurations that can be deployed to the field immediately, including a high-performance LLM and a diffusion-type LLM advantageous for field deployment, and is developing reinforcement models in parallel for cost optimization and flexible response.
Technically, it said it applied a mixture-of-experts (MoE) architecture. MoE is a structure that activates only the expert modules needed for a given input, out of many, and the company said it designed the approach to concentrate specialized knowledge while keeping the model scalable. According to the presentation, this reduced memory usage by about 83% and improved the computational efficiency of attention, the operation by which AI finds important cues in context, by 40%.
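To make the MoE idea concrete, here is a minimal sketch of top-k expert routing in Python. It is an illustration only: the dimensions, the tanh expert blocks, and the top-2 routing rule are assumptions, not details of NC AI's VAETKI architecture.

```python
# Minimal mixture-of-experts (MoE) routing sketch, for illustration only.
# Sizes, the tanh expert blocks, and top-2 routing are assumptions,
# not NC AI's actual VAETKI design.
import numpy as np

rng = np.random.default_rng(0)
D_MODEL, N_EXPERTS, TOP_K = 64, 8, 2               # hypothetical dimensions

# Each "expert" is a small feed-forward block; here just one weight matrix.
experts = [rng.standard_normal((D_MODEL, D_MODEL)) / np.sqrt(D_MODEL)
           for _ in range(N_EXPERTS)]
router_w = rng.standard_normal((D_MODEL, N_EXPERTS)) / np.sqrt(D_MODEL)

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route each token to its top-k experts and mix their outputs."""
    logits = x @ router_w                          # (tokens, n_experts) router scores
    top = np.argsort(logits, axis=-1)[:, -TOP_K:]  # indices of the chosen experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):                    # per-token loop, kept simple for clarity
        chosen = top[t]
        gate = np.exp(logits[t, chosen] - logits[t, chosen].max())
        gate /= gate.sum()                         # softmax over the chosen experts only
        for g, e in zip(gate, chosen):
            out[t] += g * np.tanh(x[t] @ experts[e])
    return out

tokens = rng.standard_normal((4, D_MODEL))         # 4 dummy token embeddings
print(moe_layer(tokens).shape)                     # -> (4, 64)
```

The design point the sketch shows is that only a few expert blocks run for each token, so total parameter count can grow without per-token compute and memory growing at the same rate, which is the kind of efficiency the company's figures describe.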
On the data side, it said it built a hyperscale dataset of 20 trillion tokens. A token is the unit in which AI processes text, so a larger token count means the model was trained on more text. NC AI said it deepened the model's expertise by securing 14 types of multimodal data spanning not only domain documents but also medical and safety data, and that it strengthened safety and ethics with multimodal AI safety alignment data.
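As a rough illustration of what a token count like 20 trillion measures, the short Python sketch below counts tokens in a sentence two crude ways; the whitespace split and the roughly-four-characters-per-token heuristic are stand-ins for illustration, not NC AI's actual tokenizer.

```python
# Illustration of "tokens" as a unit of training data, not NC AI's tokenizer.
# Real LLMs use subword tokenizers (e.g., byte-pair encoding); the whitespace
# split and the ~4-characters-per-token heuristic are rough stand-ins.
text = "Industry-specialized models must understand on-site terminology and regulations."

word_tokens = text.split()              # naive proxy: one token per whitespace word
approx_subwords = len(text) / 4         # common rule of thumb for English text

print(len(word_tokens), "word-level tokens")
print(round(approx_subwords), "tokens by the ~4 chars/token heuristic")
# A 20-trillion-token corpus is this kind of unit scaled up to about 2e13 tokens.
```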
NC AI proposed its DomainOps platform as a framework for going beyond model development to deploying models on site and keeping them working well. DomainOps is an operations framework for easily developing, tuning, and deploying models tailored to industry-specific environments, with features such as automatic resource allocation and scheduling, fine-tuning, one-click deployment, and on-premises model downloads. The company also said the platform was recognized at the WITS 2025 international academic workshop.
NC AI added that, based on this framework, it is running more than 28 projects to roll the technology out to industrial sites in manufacturing, logistics, defense, and other areas. Specific examples included smart factory transformation, applications in aviation and construction, and cultural content projects. It said these wide-ranging projects are proving VAETKI's performance and applicability.
That day, Lee introduced VARCO 3D as a real service case. By combining the VAETKI LLM with a 3D generation model, it can generate 3D assets from text alone and render even complex shapes and textures. In particular, it shortened a 3D production process that used to take more than four weeks to under 10 minutes, and the company said the service reached 20,000 monthly active users (MAU) as of December.
As another example, the company presented the VAETKI LLM combined with sound generation. When a specific scene is described in language, the LLM infers which sounds are needed and automatically generates high-quality audio, and the company said it can produce an unlimited variety of sounds such as background music, sound effects, and character sounds.
NC AI also unveiled its step-by-step strategy going forward. In phase two, it aims to complete a 200B-parameter high-performance LLM, and in phase three it said it will develop a diffusion-type LLM for industry specialization. In phase four, it plans to move into full-scale industry rollout through multiscale and multimodal packages.
As a mid- to long-term timeline, it also presented a plan to complete demonstrations in Korea's leading industries in 2026, and in 2027 pursue global AI safety and reliability standard certifications while ramping up exports of sovereign AI to new markets in the Middle East and Southeast Asia.
Meanwhile, the government will conduct a first-stage evaluation in January to comprehensively review the performance and future plans of the teams participating in independent AI foundation model development, and will narrow the field to four elite teams based on the results. It will then hold a review every six months, cutting one elite team each time, before finally selecting two teams in 2027.