Deepfake pornography has emerged as a terrifying form of image-based abuse – and British female politicians are the latest targets.
Sexually explicit digital forgeries – more commonly known as deepfakes – are digitally altered images that replace one person’s likeness with another’s, often in a nude or sexualised way.
An investigation by Channel 4 News has found 400 digitally altered pictures of more than 30 high-profile UK politicians on a popular deepfake site dedicated to degrading women.
Channel 4 revealed that the victims include Labour’s Deputy Leader Angela Rayner, Conservative Commons Leader Penny Mordaunt, Education Secretary Gillian Keegan, former Home Secretary Priti Patel and Labour backbencher Stella Creasy.
It’s understood that some images of the politicians were “nudified” – meaning AI software was used, without consent, to turn existing images into nude, sexualised media – while others were created using less sophisticated technology such as Photoshop.
Channel 4 News presenter Cathy Newman, who has also spoken out about experiencing deepfake pornography abuse, reports that several of the affected women have contacted the police.
Labour MP Stella Creasy told Channel 4 News that the images made her feel “sick”, adding that “none of this is about sexual pleasure; it’s all about power and control”.
Dehenna Davison, who has stood down as a Conservative MP, was also a victim of this kind of image-based abuse, describing it as “quite violating”. She added that “major problems” loom unless governments around the world implement a proper AI regulatory framework.
The current law on deepfakes in England and Wales is woefully inadequate. While the Online Safety Act criminalises the sharing of such material, there is no legislation explicitly outlawing the creation of non-consensual deepfakes. This means that while the people uploading this material onto deepfake websites could theoretically be prosecuted, they wouldn’t face any additional charges for creating the images in the first place.
The Conservative government’s plans to criminalise the creation of deepfake porn – announced following a parliamentary roundtable hosted by GLAMOUR – were scrapped when the general election was called, before the legislation could pass.
It comes after GLAMOUR teamed up with the End Violence Against Women Coalition (EVAW), Not Your Porn, and Clare McGlynn, Professor of Law at Durham University, to demand that the next government introduce a dedicated, comprehensive Image-Based Abuse law to protect women and girls.
The law – as a starting point – must include the following commitments:
1. Strengthen criminal laws about creating, taking and sharing intimate images without consent (including sexually explicit deepfakes)
2. Improve civil laws for survivors to take action against perpetrators and tech companies
3. Prevent image-based abuse through comprehensive relationships, sex and health education
4. Fund specialist services that provide support to victims and survivors of image-based abuse
5. Create an Online Abuse Commission to hold tech companies accountable for image-based abuse
Professor McGlynn, GLAMOUR’s ‘Stop Image-Based Abuse’ partner, argues that the Channel 4 investigation “shows that sexually explicit deepfakes are being used to try to silence women politicians, to scare them from public office and speaking out.
“Deepfake sexual abuse threatens our democracy and must be taken more seriously. The videos found are just the tip of the iceberg of what is available. But also, every woman and girl is now threatened by deepfake sexual abuse – we know it can happen to any one of us at any time, and there is very little we can do about it. That’s what must change.”
Rebecca Hitchen, Head of Policy & Campaigns at EVAW, further notes, “Online abuse silences women and girls and forces us to constantly think about what we say and do online, which is often the perpetrator’s intention.
“This violence is about power and control and it is already having a chilling impact on women and girls’ freedom of expression, our ability to participate in public life online, our work prospects, relationships and much more.
“The targeting of female politicians and other women in the public eye is designed to send a message to women to stay in line with patriarchal gender norms and expectations or suffer the consequences. But it doesn’t have to be this way.
“If the next government is serious about ending violence against women and protecting our rights and freedoms, there are clear actions it can take – from strengthening criminal and civil laws on online abuse, to prioritising prevention work that addresses the attitudes that normalise and trivialise this abuse, and holding accountable the tech companies that profit from it.”
Elena Michael, director of Not Your Porn, notes, “While politicians and lawmakers debate, very real people – particularly women and girls – from all walks of life are subject to preventable harm.
“The C4 report demonstrates that we lack a comprehensive system of protections and preventions and that current legislation doesn’t go far enough. I welcome the widespread cross-party support for properly tackling image-based abuse – but how many times do we have to tell you that you can’t tackle image-based abuse without including preventive measures? How many times do we have to tell you this can’t be achieved without listening to survivors and experts?
“We are telling you, as we have been for years, what is needed. Are you truly listening?”
Revenge Porn Helpline provides advice, guidance and support to victims of intimate image-based abuse aged 18 or over who live in the UK. You can call them on 0345 6000 459.
The Cyber Helpline provides free, expert help and advice to people targeted by online crime and harm in the UK and USA.