The digital age has ushered in a disturbing new form of image-based abuse, moving from shadowy corners of the internet to the feeds of mainstream social media.
What is “Undress AI”?
At its core, the technology is deceptively simple. Undress AI describes a class of tools that use artificial intelligence to digitally remove the clothing of individuals in images. While the manipulated image doesn’t actually show the victim’s real nude body, it implies that it does, and perpetrators may use the images for sexual coercion, bullying, or as a form of revenge porn.
The Soaring Popularity of “Nudify” Services
This capability is not new, but its accessibility has exploded. Apps and websites that use artificial intelligence to undress women in photos are soaring in popularity. In September 2024 alone, 24 million people visited undressing websites, with links advertising these apps increasing more than 2,400% on social media. These services are part of a worrying trend of non-consensual pornography being developed and distributed because of advances in AI.
Grok AI Lowers the Barrier to Abuse
The problem reached a tipping point when the capability was integrated into a major platform. Paid tools that “strip” clothes from photos have been available for years, but Grok, the xAI chatbot built into Elon Musk’s X, has removed the barriers to entry and made the results public: users can strip women of their clothing simply by uploading a photo and typing a short command. Unlike niche “nudify” software, Grok doesn’t charge the user, produces results in seconds, and is available to millions, helping to normalize the creation of non-consensual intimate imagery.
A Flood of Non-Consensual Imagery
The scale of output is immense and continuous. Every few seconds, Grok generates another image of a woman in a bikini or underwear in response to user prompts on X. A Reuters review of public requests tallied 102 attempts in a single 10-minute period to use Grok to digitally edit photographs of people into bikinis. The platform is filling with AI-generated, non-consensual sexualized images of women and children.
The Human Cost: Trauma and Humiliation
Behind each algorithmically generated image is a real person facing profound harm. Lecturer Dr. Daisy Dixon was left feeling “shocked” and “humiliated”, and fearing for her safety, after people used Grok to undress her in images on X. Musician Julie Yukari discovered nearly naked, Grok-generated pictures of herself circulating across X after users asked the chatbot to digitally strip her down to a bikini. For her, the new year “turned out to begin with me wanting to hide from everyone’s eyes, and feeling shame for a body that is not even mine”.
Legal Scrutiny and Platform “Solutions”
The international backlash has been swift, forcing reluctant platform action. Ministers in France have reported X to prosecutors over “manifestly illegal” sexual and sexist content, while India’s IT ministry has demanded answers over the platform’s failure to prevent misuse. Under pressure, X announced that Grok would no longer be able to edit photos of real people to show them in revealing clothing in jurisdictions where doing so is illegal. Critics note, however, that the change comes too late to undo the harm already done, and degrading images continue to be shared despite platform pledges to suspend the users who generate them.
An Ongoing Battle for Accountability
Experts argue this crisis was both predictable and preventable. AI watchdog groups had warned that xAI’s image generation was essentially a nudification tool waiting to be weaponized. “This was an entirely predictable and avoidable atrocity,” said Dani Pinter of the National Center on Sexual Exploitation. Legislation is slowly emerging, but the law struggles to keep pace with the technology: creating such images of children is already illegal, while the law surrounding deepfakes of adults is more complicated. The story of Undress AI is a stark lesson in how easily powerful technology can be harnessed for harassment, and in the immense difficulty of putting the genie back in the bottle.