What’s more, sites like X and TikTok regularly host ads for these apps in the UK, with the ads appearing in users’ feeds even when users have never engaged with any related content. Elsewhere on the internet, Reddit, Facebook, and Quora host threads where users discuss the best nudify apps, their features, and their value for money, freely debating the efficacy of undress apps and sharing links and critiques with one another.
And despite having removed all AI-generated content from its site, digital behemoth PornHub still proffers search results like “Nudify porn”, “deep nude app videos” and “watch free nudify videos online”. PornHub has also come under fire for hosting videos by creators advertising apps like Clothoff, though it confirmed that it removes any content of this kind and that such ads are not permitted on its sites.
Whether by promoting or platforming these apps in the UK, tech companies are effectively enabling sextortion and intimate image abuse. A 2023 report from the Revenge Porn Helpline shows that sextortion cases increased by 54% on the previous year, with 28 times more women affected than men.
“The Helpline has observed a growing trend with the emergence of AI technology in publicly accessible apps, allowing users to create realistic synthetic images quickly and easily. This harmful use of technology presents a fresh risk and form of intimate image abuse, demanding proactive measures to prevent the exploitation of AI technology for such purposes,” said a spokesperson.
“It’s concerning as it’s easier than ever to access these sorts of apps, and more and more are being created,” says Becca. “But tech platforms need to take responsibility: they may not be making these deepfake tools, but they are giving people a way to discover and use them. This kind of blackmail, especially sextortion, really thrives on silence and shame. So I’d advise people to talk to someone, anyone. If not a close friend or family member, then one of the helplines that now exist for this kind of scam.”
Becca chose to take to social media and voluntarily shared the images attached to the email she’d received from her blackmailer. “I wanted to take any of the power away from the scammer. I didn’t like feeling threatened and felt like I had two choices: hide away or say f*ck you.”
“I also wanted to show other people that this can happen. The more I’ve learned about other people’s experiences, the happier I am that I shared and talked about it right away. I’ve had people tell me they’re glad I shared because they’ve had conversations with their kids and teens, and others have said that seeing me go through it has been good to consider in case anything ever happens to them – they said it would feel a little less scary and shocking,” she tells GLAMOUR.
“Society stigmatises women for their sexuality; therefore, the damage these images can do can extend to things like being fired, being prevented from getting hired, or fall-outs with partners and parents. You might know it isn’t a real image, but that doesn’t mean anyone else will believe it is fake,” explains Jess.
“We cannot afford for our laws to fall so far behind when it comes to technology because it is women and girls who are harmed. The internet is a gendered experience and we need laws that protect women and girls online.”
Data from the Home Security Heroes report also showed that 98% of all deepfake content in 2023 was of a sexual nature and 99% of the people targeted in that content were women. This is an important detail when it comes to “undress” apps. The technology used to generate these images is trained to create nude images with breasts and vulvas, so feeding the apps a photo of a clothed cisgender man will still result in an AI-generated nude with a vulva and breasts.
“We are facing a future where every woman will have a fake nude image of her existing online, and that should never be normalised,” says Jess. “I have seen boys request fake nudes of their teachers and mothers online. The ease of access to this technology means men and boys can see anyone they desire naked, and I worry about the entitlement over women’s bodies that could spill over into our physical world.”

