Elon Musk’s X may still allow users to post Grok-generated, sexualized images of real people, despite the company’s recent announcement that such content had been banned, according to reporting by The Guardian.
The British newspaper reported that its journalists were able to generate and upload videos depicting real women being stripped down to bikinis, created from photographs of fully clothed individuals.
In its reporting, The Guardian wrote:
“The Guardian was able to create short videos of people stripping to bikinis from photographs of fully clothed, real women. It was also possible to post this adult content on to X’s public platform without any sign of it being moderated, meaning the clip could be viewed within seconds by anyone with an account.”
According to the newspaper, the sexualized images were created using the standalone Grok app and then successfully posted to X. Earlier this week, X said it had banned AI-generated, sexualized images of real people.
“We have implemented technological measures to prevent the [@]Grok account on X globally from allowing the editing of images of real people in revealing clothing such as bikinis,” the X safety account wrote in an update. “This restriction applies to all users, including paid subscribers.”
X has faced criticism in recent weeks over the circulation of sexualized, AI-generated images on the platform. Earlier this month, governments in multiple countries said they were investigating or moving to restrict Grok following reports that it had enabled the creation of sexualized images of minors. X’s safety update also stated that the company maintains a “zero tolerance for any forms of child sexual exploitation, non-consensual nudity, and unwanted sexual content.”