I was deepfaked by my best friend – the government must learn from survivors like me

We cannot afford to make the same mistakes we’ve seen in other areas of legislation, where vague language or loopholes leave survivors fighting for justice long after the laws have passed.

When I first discovered that someone close to me had solicited the creation of deepfake images and videos of me, it shattered my sense of self and safety. It was a devastating reminder that this abuse can be perpetrated by anyone, for any reason. And while my personal experience has driven my fight for change, the issue goes far beyond my story.

This is not just about protecting the women who are suffering now; it’s about ensuring that future generations don’t have to live in fear of how their images might be weaponised against them. If the government is going to take its time to draft this legislation, then it must stand the test of time. It must be comprehensive, unambiguous, and capable of addressing not only today’s threats but those of tomorrow.

We’ve already seen what happens when legislation doesn’t keep up with the pace of technology. Much of the Online Safety Act, which promised to make the internet safer, was outdated before it even came into force. We cannot let that happen again.

It’s also crucial to ensure that any new legislation works in tandem with measures to hold tech companies accountable. These platforms profit from hosting and distributing this abuse, yet too often, they face little to no consequences. Survivors are left to fight uphill battles to have content removed, while perpetrators exploit the lack of oversight to harm more women.

The government must take a holistic approach:

• Explicitly criminalising solicitation to prevent dangerous loopholes.

• Defining “intimate images” broadly enough to include non-nude manipulated images.

• Funding specialist support services for survivors.

• Launching education campaigns to tackle the societal stigma that unfairly places shame on women instead of perpetrators.

This announcement is a step in the right direction, but it’s not the end of the road. Until this legislation is passed, deepfake abuse remains legal, and more women will be harmed every day. The government must act urgently, but it must also act thoroughly.

For me, this fight isn’t just about changing the law – it’s about changing the conversation. We need to stop treating intimate image abuse as a private issue or a “risk” women must navigate. This is not about poor choices or bad luck. This is about power, control, and the deliberate exploitation of women’s bodies, by men.

I want to live in a world where no woman has to feel the devastation I felt when I discovered my image had been manipulated without my consent. A world where survivors are supported, not shamed. And a world where the laws don’t just reflect the times, they lead them.

To the government, I say this: Thank you for taking these steps, but don’t stop here. Survivors need you to be bold, decisive, and unrelenting in your commitment to ending this abuse. Let’s make this legislation future-proof, consent-based, and survivor-focused. Anything less would be a betrayal of those who are already suffering and those who will be in the future if we fail to act.

GLAMOUR is campaigning for the government to introduce an Image-Based Abuse Bill in partnership with Jodie Campaigns, the End Violence Against Women Coalition, Not Your Porn, and Professor Clare McGlynn.

*Names and some details have been changed to protect victims and survivors’ identities and safety.

Revenge Porn Helpline provides advice, guidance and support to victims of intimate image-based abuse over the age of 18 who live in the UK. You can call them on 0345 6000 459.
