(Looking at you sleeping in the room, sitting on the bed) Are you still asleep?
On the 23rd, when this reporter selected a male character in 'WOW,' a Chinese virtual-lover chatbot service built on generative AI, and opened the chat window, these words flowed out in a young man's voice. The same content was displayed as text on the screen, with the man's actions described in parentheses so that he could be pictured. This AI lover expressed affection, responding to the request 'Introduce yourself' with '(Facing each other close enough to feel each other's body heat) I am your husband, Ni Qinyi (倪钦毅).' The app displayed a message saying, 'This young gentleman originally had no feelings for his arranged marriage partner, but (depending on your actions) the situation can change.'
As China's AI chatbot market grows rapidly, concern is mounting over adolescents addicted to the services. With no age or time restrictions, a growing number of teenagers cannot pull themselves away from AI chatbots and are exposed to explicit and violent language in conversation. Long conversations with AI can foster distorted values, and monetary losses, such as subscription fees, are also occurring. There are calls for stricter regulations, similar to those applied to online games, which minors may play only during set hours on weekends.
The Chinese financial outlet Cailian reported that 'AI chat software focused on emotional connection is quietly penetrating the youth sector,' sharing the story of a man named Li, who has a sixth-grade daughter. His daughter recently became deeply immersed in an 'AI boyfriend,' and the level of their conversations rivals that of romance dramas. He was shocked to learn, through conversations with other parents, that more than half of his daughter's classmates are hooked on AI chat apps. Li noted, 'It is more addictive than WeChat (China's national messenger),' adding, '(I have) deleted the AI chat app several times, but each time the child reinstalls it. This app knows how to manipulate children.'
The AI market specializing in emotional exchange is growing rapidly in China. According to the tech outlet 36Kr, China's 'emotional companion' AI market is projected to grow more than fifteenfold, from 3.866 billion yuan this year to 59.56 billion yuan by 2028. 36Kr noted, 'Many young people are publicly sharing their AI lovers on social media such as Xiaohongshu (China's Instagram),' and stated, 'While the concept of virtual lovers has existed for a long time, the recent explosive advancement of large language model AI (AI that speaks naturally like a human) has significantly increased the number of users forming relationships with AI.'
There are various ways to interact with the AI. Users can set the AI's name and image themselves, or choose from the virtual characters the app provides. Some apps offer specific scenarios within which users role-play. For instance, in the AI chatbot app 'Xiaoyequ (小夜曲),' users can interact with characters such as a doctor they bump into in a hospital, a male student they meet at a cosplay festival, or a conglomerate chairman. One Xiaohongshu user commented that the Xiaoyequ app 'provides stimulating interactive experiences through text, voice, and images,' adding, 'It feels like being with a real romantic partner.'
Chinese media have been focusing on the fact that these AI chatbots neglect youth protection. First, the content of the conversations is problematic. While some apps prohibit explicit and violent expressions, most have no such safeguards. On Xiaohongshu, users can find posts recommending AI chatbots with no restrictions on 'sensitive words.' Another issue is that the chatbots often give bad advice when users share their worries. The Chinese state media Anguangwang remarked, 'Adolescents are prone to falling into these conversations,' adding, 'In the process, words and actions inappropriate for their age occur.' It also emphasized, 'This misguides the development of adolescents, disrupts their learning and daily life, and exposes them to harmful language and emotional patterns, leading to distorted perceptions and values.'
Some apps have a 'youth protection mode,' but this too is criticized as mere window dressing. Pengpai News reported, 'I used five AI chatbot apps, and you can register with nothing but a phone number,' noting, 'Some apps ask whether you want to activate youth mode, but they do not verify identity or age even if you refuse.' Unlike Chinese game companies, which allow minors to play only from 8 to 9 p.m. on weekend evenings, these apps impose no such restrictions. Anguangwang stated, 'When explicit AI characters and obscene content appear, youths with limited independent judgment find it difficult to resist the temptation.'
Monetary issues are also arising. Many apps require payment after exceeding a certain time or number of questions. There are also 'pyramid scheme' apps that offer points when users send subscription links to acquaintances. Chinese media Xinyuanchuanbao remarked, 'Most of those who post promotional articles for points are youths, including students,' and stated, 'By simply downloading the app and sharing a basic link, they can entirely bypass youth protection barriers and enter a world filled with risks.'
As adolescent addiction becomes a hot topic, there are calls to regulate AI chatbots that form emotional connections. Chen Huan, a lawyer at Longan Law Firm, told Cailian, 'Currently, the "Minor Internet Protection Regulations" require real-name verification only for games and live broadcasting platforms, and AI chat software is excluded from mandatory age identification.' Xinyuanchuanbao called for 'all AI software to implement a real-name system as soon as possible and to strictly distinguish between youths and adults.'