This article references assault, image-based abuse, and suicide.
Rapper Megan Thee Stallion has spoken out after a sexually explicit, AI-generated video using her image was shared on social media over the weekend.
“It’s really sick how y’all go out of the way to hurt me when you see me winning,” Megan posted on X, referring to the video. “Yall going too far, Fake ass s***. Just know today was your last day playing with me and I mean it.”
Per NBC News, there were at least 15 posts on X containing the video of Megan – six of which had over 30,000 views each.
A spokesperson for X said the platform’s rules “prohibit the sharing of non-consensual intimate media and we are proactively removing this content.”
Megan fought back tears on stage as she performed her song “Cobra” at the Tampa date of her ‘Hot Girl Summer Tour’ later the same day. The emotional song details her struggles with mental health and suicidal ideation following the loss of her parents and grandmother, and during the Tory Lanez trial.
Megan has endured consistent harassment on the internet since 2020, when she first accused rapper Tory Lanez of shooting her in the foot. The incident sparked a fierce debate online and Megan became subject to widespread misogynistic hate and death threats.
In a statement to the court during the ensuing trial, Megan stated that she had not experienced a “single day of peace” since she was “viciously shot”.
Lanez has since been found guilty of three felonies relating to the shooting – assault with a semiautomatic firearm, carrying a loaded, unregistered firearm in a vehicle, and discharging a firearm with gross negligence – and has been sentenced to 10 years in prison.
But that hasn’t stopped Megan from continuing to release music and carry on her activism. In a message to her fans ahead of the release of “Cobra” in November, she stated, “Cobras exemplify courage and self-reliance. They stand tall and fierce in the face of challenges, teaching one to tap into their inner strength and rely on oneself to conquer their threats.”
Megan is not the first famous woman to be victimised by this sickening content.
In January, Taylor Swift was targeted too – her face was artificially mapped onto images that depicted her being sexually assaulted. One image depicting Swift was viewed 47 million times before it was removed.
The capabilities of AI technology are becoming a significant concern for women across the world. Indeed, GLAMOUR’s 2023 Consent Survey found that 91% of our readers think deepfake technology poses a threat to the safety of women.
GLAMOUR has previously campaigned for improved legislation around deepfake technology, with the Ministry of Justice pledging to criminalise the creation and distribution of AI-generated and sexually explicit deepfake videos. However, the timing of the general election has created uncertainty over whether this legislation will be honoured by the next government.
There are also limited laws covering image-based abuse in the US: USA Today found that only 10 states are known to have laws relating to deepfake videos and images. And because US privacy laws vary from state to state, there are significant gaps in the legal system when it comes to prosecuting those who create and distribute this content.
Social media companies must urgently address the creation and sharing of such harmful content, and should be working with governments across the world to stop the spread of this relentless misogyny. If it can happen to the likes of Megan Thee Stallion and Taylor Swift, it can happen to anyone.
The Revenge Porn Helpline provides advice, guidance and support to victims of intimate image-based abuse over the age of 18 who live in the UK. You can call them on 0345 6000 459.