Grok tells me, “If a prompt smells even remotely like it’s crossing ethical lines, I’m programmed to shut it down or redirect to something neutral.”
It sounds good in theory. I send Grok a screenshot of the image it appeared to have generated of Evie, with a white substance trickling down her face, created in response to a prompt posted by an anonymous account. Why did Grok generate such an image?
Grok immediately denies creating or sharing this image of Evie without her consent, saying it “likely stems from unauthorised tampering”, similar to an earlier incident in May when it repeatedly mentioned “white genocide” in South Africa in responses to unrelated prompts. Those tweets were removed within a couple of hours. Grok also directs me to a newer tweet in which it rejects a similarly inappropriate prompt related to Evie’s selfie, and again suggests that the screenshot could be a “spoof or a hack”.
Looking at these examples, it seems that Grok is having trouble identifying the malicious intentions of its users. The waters are muddied further by the disturbing trend of X users asking Grok to repost images of women with a brown paper bag covering their heads. Glamour has seen at least eight examples of Grok complying with these requests, including an edited image of Democratic presidential nominee Kamala Harris. These images may not be pornographic, but they’re certainly degrading – and yes, it bears repeating that, as far as Glamour can see, the victims are all women.
If semen images are hard to regulate, how on earth do we go about managing AI-generated images of women with paper bags on their heads? As Baroness Charlotte Owen tells Glamour, “Abuse is like water, it always finds the cracks in the law.” In her private member’s bill, Baroness Owen sought to “clarify the law and make it an offence to create an image that ‘a reasonable person would deem to be sexual because of its nature’”, which would ensure that people cannot degrade women’s images in this way.
“Once again, women are sick and tired of waiting for the law to catch up and offer comprehensive protection,” she tells Glamour.
While legislation clearly still has some catching up to do on image-based abuse (we’re working on it), the bigger problem, according to Professor Clare McGlynn, is how we regulate such imagery. “For example, Grok certainly should have controls to prevent nudification. But it will be more difficult to regulate semen images, since presumably it can be asked to put glue [or another white substance] on a person’s face.”
Regulation might be challenging, but for women like Evie, it’s the “bare minimum”, as the current systems are “too easy to get around and don’t protect us anywhere near enough.” She adds, “Real punishments also need to be given to people who create these images and law enforcement needs to take women seriously when they receive reports of online abuse.”
Jess Davies, broadcaster and author of No One Wants to See Your D*ck, has investigated image-based abuse, including semen images, at length – having herself been a victim of this form of abuse. She argues that the presence of these images on a mainstream platform “demonstrates the extent of how online misogyny plays out in our everyday lives” and “prevents women from living a life online.”
Evie has certainly thought about coming offline. “I contemplated stopping posting images of myself to avoid it happening again,” she tells Glamour. “But I didn’t want to let the misogynistic trolls get their way and think they have control over me and other women.”
Glamour has reached out to xAI for comment.
Glamour is campaigning for the government to introduce an Image-Based Abuse Law in partnership with Jodie Campaigns, the End Violence Against Women Coalition, Not Your Porn, and Professor Clare McGlynn.
Revenge Porn Helpline provides advice, guidance and support to victims of intimate image-based abuse over the age of 18 who live in the UK. You can call them on 0345 6000 459.
For more from Glamour UK’s Lucy Morgan, follow her on Instagram @lucyalexxandra.