The Personal Information Protection Commission held a meeting aimed at invigorating the open-source-based artificial intelligence (AI) startup ecosystem and announced support measures to balance industrial development with personal information protection.

On the 24th, the Personal Information Protection Commission met with stakeholders from domestic AI startups at Startup Alliance N-Space in Gangnam, Seoul, to discuss ways to develop the open-source-based AI ecosystem and to hear difficulties and suggestions from the field. The meeting also served as an occasion for the government to pledge attention and policy support to the domestic AI industry, which has drawn renewed interest with the emergence of global open-source models such as "DeepSeek."

Open source allows anyone to access a model's source code or design, widening access to high-performance AI models and contributing both to scientific and technological progress and to the creation of application services. It is regarded as a growth opportunity well suited to the domestic environment, which has AI talent and high-quality data. However, it was also noted that care is needed because additional training (fine-tuning) or Retrieval-Augmented Generation (RAG) built on open-source models may involve the processing of personal information.
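The article does not describe any specific system, but as a rough illustration of where personal information can enter such a pipeline, the hypothetical Python sketch below shows a minimal RAG-style flow in which user-supplied text is masked for obvious identifiers (emails, phone numbers) before being indexed and retrieved. All function names, patterns, and data here are invented for this example and are not drawn from any company mentioned in the article.

```python
import re

# Hypothetical example: mask obvious personal identifiers (email addresses
# and phone-number-like strings) in user text before it enters a retrieval
# index used for RAG. Patterns and names are illustrative only.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\b\d{2,3}-\d{3,4}-\d{4}\b")

def mask_pii(text: str) -> str:
    """Replace simple PII patterns with placeholder tokens."""
    text = EMAIL.sub("[EMAIL]", text)
    text = PHONE.sub("[PHONE]", text)
    return text

# A toy in-memory "index": documents are stored only after masking.
index: list[str] = []

def add_document(raw_text: str) -> None:
    index.append(mask_pii(raw_text))

def retrieve(query: str, k: int = 2) -> list[str]:
    """Naive keyword-overlap retrieval standing in for a vector search."""
    terms = set(query.lower().split())
    return sorted(index, key=lambda d: -len(terms & set(d.lower().split())))[:k]

if __name__ == "__main__":
    add_document("Customer Kim asked about refunds, contact kim@example.com or 010-1234-5678.")
    add_document("Release notes for the Korean-language model update.")
    print(retrieve("refund contact"))
```

A production system would of course use far more robust de-identification and a proper vector index rather than keyword overlap; the sketch only marks the points, indexing and retrieval, at which user data is touched.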

According to a survey conducted by the Personal Information Protection Commission ahead of the meeting, six of the attending companies have launched application services based on open-source models and reported using their own user data for additional training or for performance improvement through RAG.

At the event, AI startups including Scatter Lab, More, and the Alice Group shared examples of and experiences with developing services on open source. Ha Ju-young, an attorney at Scatter Lab, explained the domestic impact of global open-source models such as Google's Gemma and DeepSeek; Lee Jung-hwan, head of business at More, discussed privacy issues encountered while developing the company's language model, which focuses on Korean response performance; and Lee Jae-won, Chief Information Security Officer (CISO) of the Alice Group, presented security-certification cases for the company's AI cloud infrastructure products and its applications of open-source models.

During the open discussion, various suggestions were made regarding the legal uncertainties and privacy concerns arising from the use of user data in AI development. In response, the Personal Information Protection Commission introduced the processing standards it has established under 'principle-based regulation' for ▲unstructured data ▲web-crawled data ▲video footage from autonomous-driving devices, and explained the direction of institutional improvements intended to remove barriers to data utilization.

Based on the results of the meeting, the Personal Information Protection Commission plans to prepare 'guidelines on the adoption and use of generative AI' that can offer practical help to small and medium-sized enterprises and startups from a personal information protection perspective.

Chairperson Ko Hak-soo of the Personal Information Protection Commission said, "To develop a competitive AI innovation ecosystem in our country, the benefits of open source must be maximized," and added, "We will work closely with domestic organizations and companies to minimize the data processing risks that may arise as they adopt open-source AI."