If an intimate image is shared online without your consent, you can now demand its removal within 48 hours, under a new law to protect women and girls.
Today (18 February), the government announced that tech platforms must detect and remove intimate images shared without consent, and that they have 48 hours to take such material down once it has been flagged.
The timing is significant. Nearly two weeks ago, I stood outside 10 Downing Street as Jodie*, a survivor of deepfake abuse, handed in a petition launched in partnership with Glamour, EVAW, Professor Clare McGlynn, and Not Your Porn. It called for a dedicated Image-Based Abuse Law that would, at a minimum, introduce 48-hour takedown orders against tech companies that host so-called 'revenge porn' or deepfake abuse.
In an exclusive statement to Glamour, Tech Secretary Liz Kendall said, “I want to thank Glamour and the survivors and campaigners who have fought tirelessly for change. You called for rapid 48-hour takedowns and we have listened and we have acted. Your voices have been heard.
“The days of tech firms having a free pass are over. We are determined to make the internet a space where women and girls can feel safe, respected and protected.”
Jodie* tells Glamour, “For too long, victims of intimate image abuse have been told to be patient while sexually explicit photos of them circulate, reappear, and re-traumatise. The promise today from government that images must be removed within 48 hours – and that survivors will only have to report their images once – is something campaigners in the UK and around the world have fought for over many years.
“Now, time will tell whether platforms are truly held to account when they fail to comply, and whether the government follows through on its commitment to enforcement, including blocking access to sites that refuse to act.
“I hope these protections extend beyond mainstream social media platforms to chat sites, messaging services, and any space where this abuse can be perpetrated. We know that abusers do not respect platform boundaries and T&Cs, so the response must reflect the realities of this.
“This is an important step forward, but for survivors, safety will only be realised when the law is enforced consistently, platforms are proactive, and the burden no longer falls on victims.”