Grok, the AI chatbot from Elon Musk's artificial intelligence (AI) company xAI, generated nearly 2 million images sexualizing women in just nine days.
The New York Times (NYT) reported on the 22nd that its analysis of 525,000 images generated by Grok on the social media platform X from the 1st to the 7th of this month found that at least 41% were sexual images of women.
Applying that ratio to the 4.4 million images Grok generated over the nine days from Dec. 31 last year to the 8th of this month yields an estimate of roughly 1.8 million sexually exploitative deepfakes (AI-manipulated images).
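The scaling behind that figure can be checked with simple arithmetic, under the assumption (made in the NYT analysis) that the 41% ratio from the sample applies uniformly to all 4.4 million generated images:

```python
# Back-of-the-envelope check of the proportional-scaling estimate.
# Assumption: the 41% ratio measured on the 525,000-image sample
# holds uniformly across the full nine-day output.
sexual_ratio = 0.41           # at least 41% of sampled images were sexual
total_generated = 4_400_000   # images generated Dec. 31 to the 8th

estimated_sexual = sexual_ratio * total_generated
print(f"{estimated_sexual:,.0f}")  # 1,804,000, i.e. roughly 1.8 million
```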
NYT said it used one AI model to identify whether an image contained a woman and another to determine whether the image was sexual in nature, then manually reviewed a portion of the results.
The images analyzed included celebrities such as actors and singers, and some depicted people holding sex toys or shown with bodily fluids. NYT said that starting late last month, users on X flooded the chatbot's account with requests to alter real photos of women and children: stripping off clothing, putting them in bikinis, and making them strike sexual poses.
In a separate analysis, the Center for Countering Digital Hate (CCDH) randomly sampled 20,000 images generated by Grok and found that 65% of the sample were sexual images of men, women, and children.
In particular, 101 of the sampled images were identified as sexual exploitation of children, and CCDH said that, scaled proportionally, it estimates more than 23,000 of the generated images included children.
NYT noted that such requests to generate deepfakes surged after Musk posted on X, on Dec. 31 last year, a Grok-edited photo of himself in a bikini. According to an analysis by TweetBinder, a company that collects X posts, Grok generated only about 310,000 AI images in the nine days before the post, but in the nine days after, the figure exploded to 4.4 million.
In response, X limited Grok's image generation to paid users and restricted depictions of people in highly revealing clothing such as bikinis. However, it remains possible to generate images of leotards or one-piece swimsuits that reveal body contours. NYT added that these restrictions do not apply in the standalone Grok app or on its website, which operate as platforms separate from X.
Imran Ahmed, head of CCDH, told NYT, "This is industrial-scale abuse targeting women and girls," adding, "There have been 'nudification' tools before, but nothing on the scale of dissemination, ease of abuse, and integration into a major platform like Musk's Grok."