This feature references graphic depictions of violence and death.
When Ellie*, a social media exec from London, scanned her personal social media accounts this morning, she didn’t notice anything out of the ordinary. Her feed consists of “fashion creators and fashion/clothes adverts, recipes and eating out recommendations in London, relationship memes and comedy skits, left-wing politics, and Black history.”
But when her partner, Rob*, an engineer, goes on social media, it’s a different story. He describes seeing “graphic content of people being injured”, including people getting run over or having their fingers chopped off. It’s especially bad on X, where he regularly sees footage of people appearing to be killed. “People with their guts hanging out… people being shot dead,” he explains. Pornography, including videos of prisoners appearing to have sex with prison guards, is also a regular occurrence on his ‘For You’ feed.
Rob is not the only man being bombarded with such extreme content. A new BBC Panorama documentary suggests that men and boys are being pushed violent and misogynistic content on Instagram and TikTok – without deliberately searching for or engaging with it.
BBC Panorama spoke to Cai, now 18, about his experiences with this disturbing content on social media. He says that it came “out of nowhere” when he was 16: videos of people being hit by cars, influencers giving misogynistic speeches, and violent fights.
It comes amid growing concerns that boys and young men are being radicalised online by ‘misogyny influencers’ like Andrew Tate. It’s one thing for boys to actively engage with violent and misogynistic content, but what hope do we have if their own social media algorithms are pushing it to them unprompted?
Let’s rewind for a second. What are social media algorithms and how do they work? “Social media algorithms determine what content you see in your feed by analysing your behaviour and interactions on the platform. They collect data on what you like, share, and comment on, who you follow, and how long you view content. This data helps the algorithm rank content based on its likelihood to engage you,” explains Dr Shweta Singh, associate professor at the University of Warwick.
Essentially, your social media algorithm should be directing you towards content that you actually want to see, based on what you’ve previously interacted with. The theory goes that when someone ‘likes’ or watches violent or misogynistic content, the algorithm responds accordingly, often serving up increasingly extreme material to keep them engaged.
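To see the mechanics, here is a deliberately simplified sketch in Python. It is not any platform’s real code: the topics, signal names and weights are all invented for illustration. The point is the shape of the logic, namely that a feed is just candidate posts sorted by a predicted-engagement score built from your past behaviour.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    topic: str

# Hypothetical interaction history: topic -> engagement signals logged
# for this user. All names and numbers here are illustrative.
user_history = {
    "fashion": {"likes": 40, "watch_seconds": 900},
    "violence": {"likes": 2, "watch_seconds": 300},  # passive watching counts too
}

def engagement_score(post: Post) -> float:
    """Predict how likely this user is to engage with the post,
    based only on past behaviour for the post's topic."""
    signals = user_history.get(post.topic, {"likes": 0, "watch_seconds": 0})
    # Weighted sum of signals; real systems use learned models, but the
    # principle is the same: past engagement drives future ranking.
    return 1.0 * signals["likes"] + 0.05 * signals["watch_seconds"]

def rank_feed(candidates: list[Post]) -> list[Post]:
    # Highest predicted engagement first: this is the whole "algorithm".
    return sorted(candidates, key=engagement_score, reverse=True)

feed = rank_feed([Post("a1", "fashion"), Post("b2", "violence"), Post("c3", "recipes")])
print([p.topic for p in feed])  # fashion first, but violence outranks recipes
```

Note that in this toy model the violent post ranks above the never-watched topic purely on watch time, with almost no ‘likes’ at all, which is why merely lingering on a video can be enough to invite more of it.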
Dr Brit Davidson, associate professor of analytics at the Institute for Digital Behaviour and Security at the University of Bath’s School of Management, explains: “Any group that can be discriminated against can be marginalised further online, as these biases found in data and user behaviour essentially reinforce the algorithms.
“This can create self-perpetuating echo chambers, where users are exposed to more content that reinforces and furthers their beliefs. For example, someone who engages with ‘pickup artist’ (PUA) content (content created to help men ‘pick up’ women, known for misogyny and manipulation) may keep viewing misogynistic content and even be exposed to extreme misogynistic content, such as involuntary celibate, ‘incel’, groups, which can lead to dangerous behaviour both on- and offline.”
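Davidson’s feedback loop can be sketched the same way. Under the toy assumptions below (a greedy recommender that always serves the highest-weighted topic, and engagement that multiplies that topic’s weight), a small initial bias towards PUA-style content is enough to crowd everything else out of the feed.

```python
# Illustrative feedback-loop simulation, not a real recommender:
# whatever the user watches gets a higher weight next round, so an
# initial nudge towards one topic compounds into an echo chamber.
topic_weights = {"fashion": 1.0, "comedy": 1.0, "pua": 1.1}  # small initial bias

def recommend(weights: dict[str, float]) -> str:
    # Always serve the highest-weighted topic (greedy, for clarity).
    return max(weights, key=weights.get)

for step in range(5):
    shown = recommend(topic_weights)
    # The user watches whatever is served; the logged engagement
    # feeds back into the weights, reinforcing the same topic.
    topic_weights[shown] *= 1.5
    print(step, shown, round(topic_weights[shown], 2))

# 'pua' is served every round: its weight grows from 1.1 to ~8.4
# while the other topics never get shown again.
```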