[Photo courtesy of Twelve Labs]

The Korea Committee for UNICEF has built a system that uses artificial intelligence (AI) to centrally manage its large volumes of video and photo materials, which had previously been dispersed across storage locations. Through an AI archive built by Twelve Labs, it converted about 8 terabytes (TB) of unstructured data into digital assets and cut search time by about 95% compared with before.

Twelve Labs said on the 1st that it has completed an AI archive that allows the Korea Committee for UNICEF to systematically manage and use its video and photo data. The project was undertaken to consolidate materials that had been stored separately on personal computers (PCs), network-attached storage (NAS), and other locations, improving searchability and utilization.

The system applies Twelve Labs' "Video Native" technology, which understands video along its temporal flow and in units of situational context. Rather than relying on simple keyword search, it analyzes the relationships among people, actions, objects, and backgrounds in the footage to grasp scenes structurally.

Accordingly, staff can find the segments they need by entering natural-language queries such as "a scene of children drinking water at an African water site." Search results present the relevant video segments along with their timestamps.
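To make the search behavior concrete, here is a minimal toy sketch of natural-language search over pre-indexed video segments with timestamps. It is an illustration only, not Twelve Labs' actual system or API: the real product uses video-native AI models, whereas this stand-in scores hypothetical per-segment text descriptions by simple word overlap with the query.

```python
# Illustrative sketch only: toy natural-language search over indexed video
# segments. All names (Segment, search, the sample index) are hypothetical.
from dataclasses import dataclass


@dataclass
class Segment:
    video_id: str
    start: float       # segment start time, in seconds
    end: float         # segment end time, in seconds
    description: str   # text assumed to be produced at indexing time


def search(segments, query, top_k=3):
    """Return up to top_k segments whose descriptions best match the query,
    scored by naive word overlap (a stand-in for semantic matching)."""
    query_words = set(query.lower().split())
    scored = []
    for seg in segments:
        overlap = len(query_words & set(seg.description.lower().split()))
        if overlap:
            scored.append((overlap, seg))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [seg for _, seg in scored[:top_k]]


if __name__ == "__main__":
    index = [
        Segment("v1", 12.0, 25.5, "children drinking water at a water site"),
        Segment("v1", 40.0, 55.0, "staff unloading supply boxes"),
        Segment("v2", 5.0, 18.0, "children playing near a school"),
    ]
    # Results carry timestamps, mirroring the behavior described above.
    for seg in search(index, "children drinking water"):
        print(f"{seg.video_id} {seg.start:.1f}-{seg.end:.1f}s: {seg.description}")
```

A real video-native system would match on visual and temporal content rather than indexed text, but the output shape is the same: ranked segments, each with a video identifier and a time range.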

Lee Jae-sung, CEO of Twelve Labs, said, "We will expand collaboration so that institutions and corporations with large volumes of unstructured data can draw meaningful value from their video assets."
