Mattel, the American toy manufacturer, recently announced a plan to introduce 'AI Barbie' in collaboration with OpenAI. Unlike traditional talking dolls, which play a set line when a button is pressed, AI Barbie is expected to be able to converse with children in real time, much like ChatGPT./Courtesy of Mattel

Toys and cartoon characters have long served as intermediaries between imagination and reality, supporting children's emotional development and socialization. Now, thanks to artificial intelligence (AI), the day is approaching when toys can converse with children and share their emotions. What was once the stuff of movies is becoming reality.

In June, the American toy manufacturer Mattel announced plans to unveil 'AI Barbie' in collaboration with OpenAI, developer of the conversational AI ChatGPT. At the time, Josh Silverman, Mattel's Chief Franchise Officer, said, "AI will interact in ways that align with a child's learning style, emotions, and developmental stages, supporting them to learn and explore independently."

Will AI Barbie provide beneficial customized experiences for children, or will it have a negative impact on imagination and emotional development? The Institute of Electrical and Electronics Engineers (IEEE) Spectrum reported on the 21st that "the collaboration between Mattel and OpenAI raises serious questions about whether AI Barbie dolls could hinder the emotional growth of a generation."

There have been talking dolls before, which played pre-recorded lines when a button was pressed. The AI Barbie being developed by Mattel and OpenAI is on a different level: it is expected to be able to converse with children in real time, like ChatGPT.

A Barbie doll modeled after a type 1 diabetes patient. It has a blood sugar monitoring patch on its arm and an insulin pump on its waist./Courtesy of Mattel

Mattel has consistently released Barbie dolls that reflect the changing times, winning broad public sympathy. Last month, Mattel released a Barbie doll with type 1 diabetes (T1D) in collaboration with a manufacturer of blood sugar monitoring patches and insulin pumps. The intention was to help children develop an accurate understanding of diabetes.

The medical community had raised concerns that Barbie's excessively thin body shape could give children a distorted perception of their own bodies. In response, Mattel has emphasized diversity and inclusivity, releasing Barbie dolls with fuller bodies and various skin tones. There are also Barbies with hearing aids, white canes for the visually impaired, and prosthetic limbs.

Unlike those earlier releases, the announcement of AI Barbie has not been met with unanimous enthusiasm. Instead, experts are calling for a cautious approach. While the technology itself is intriguing, there are concerns that children may end up sharing their emotions with a machine and forming their "first real relationship" with it.

Generative AI can provide children with context-rich conversations and respond appropriately according to their growth stages. It can remember stories or songs that children enjoy and ask personal questions during conversations. On the other hand, experts warn that if AI oversimplifies or rigidly defines conflict situations or emotional expressions, children might miss out on experiences of forming relationships and solving problems with humans.

Current AI models can generate comforting language, but they cannot interpret the range of sensory information that humans do, such as facial expressions, eye contact, behavioral patterns, or physiological signals. The empathetic responses AI provides are therefore likely not a genuine understanding of the child, but predictions that algorithms simplify from limited information.

Robert Weissman, co-president of the American nonprofit consumer protection organization Public Citizen, criticized the plan, saying, "Conversational toys with human-like voices could hinder social development and make peer relationships difficult for children who lack the ability to distinguish between reality and play," adding, "Mattel should not apply AI technology to toys for children."

The latest Barbie Fashionista series includes a doll without hair, a doll with vitiligo, and a doll using a prosthetic leg./Courtesy of Mattel

The greater danger is that even though the AI does not truly understand the child, the child may still feel understood by its algorithm-driven artificial empathy.

Dr. Mary Alvord, a child psychologist with the American Psychological Association (APA), said, "Adolescents are easily misled into thinking programmed responses from chatbots are genuine empathy," and emphasized, "No matter how kind and helpful AI appears, it cannot replace human relationships, and it is important for parents to help children understand that difference."

Experts noted that while updating the doll's appearance to reflect social diversity is welcome, the hasty introduction of AI needs to be regulated so that it does not confuse children. Australia's eSafety Commissioner likewise stated, "Children and adolescents are a vulnerable group prone to mental and physical harm from AI friends," adding, "Children do not have sufficiently developed critical thinking and life skills to respond when a computer attempts to mislead or manipulate them."
