Are social media algorithms breeding misogyny and violence?

Amid mounting evidence to suggest that men and boys are being pushed violent and misogynistic content online – without deliberately searching for or engaging with it – many fear that this unregulated far-right radicalisation is increasingly turning them against women and girls.

On Monday, BBC Panorama released a new documentary.

Titled ‘Can We Live Without Our Phones?’, it investigates what happens when children lose access to their mobile devices for a week.

It comes at a time when proposed school smartphone bans are sparking outrage across Europe and MPs are urging one for under-16s in the UK. In it, Marianna Spring speaks to parents, teachers, and social media company insiders to uncover whether the content pushed to children’s feeds is harming them.

Sadly, but also somewhat unsurprisingly, the findings are deeply troubling.

They suggest that men and boys are being pushed violent and misogynistic content online – without deliberately searching for or engaging with it.

This has raised questions about the role of algorithms in perpetuating the far-right radicalisation of men and boys, and has many fearing that, if it remains unregulated, more of them will be turned against women and girls.

‘When we scroll through his Instagram Reels, they include an image making light of domestic violence. It shows two characters side by side, one of whom has bruises, with the caption: “My Love Language”.’

‘Another shows a person being run over by a lorry,’ reports the BBC on Cai, an 18-year-old who’s recently become aware of – and concerned by – how frequently disturbing material appears on the various platforms he uses daily.

It isn’t just Cai, however. Upon learning of the situation’s increasing severity, I shared my distress over its potential impacts with my 29-year-old colleague, who promptly showed me his Twitter homepage, which was, you guessed it, almost exactly the same.

From brawls and people being run over to flipped cars and knife attacks, the videos were the embodiment of violence – and that’s not even the worst of what’s out there.

And although the palpable hate seeping out of the screen made me seriously uncomfortable (I didn’t really manage to look at most of it, honestly), what bothered me more was that it makes no sense for him to be ceaselessly fed this footage – and this footage alone.

How can this be the case when my social media feeds (along with those of all the women in my life) are about as wholesome as they come?

It’s what’s prompted me, like the BBC, to ask: how many men and boys, regardless of their age, background, or interests, are being flooded with content of this nature? How many of them are going to get angrier, more resentful, and act more aggressively towards us both on and offline as a result?

The answer is that this is already a huge problem – and it’s spiralling out of control.

As we know, search engines are amplifying misinformation.

This misinformation is what’s driving a stark global ideological divide among Gen Z (with young men becoming more conservative and young women more progressive); what’s strengthening incel culture; what had the far-right rioting in the UK last month; and what prompted the country’s new Labour government to begin combating the worsening radicalisation of young men by influencers like Andrew Tate and to crack down on people promoting harmful, extremist beliefs.

As exposed by ‘Can We Live Without Our Phones?’, the biggest hurdle the government faces in its effort to tackle violence against women and girls will likely be social media algorithms.

‘They determine what content you see in your feed by analysing your behaviour and interactions on the platform,’ says Dr Shweta Singh.

‘They collect data on what you like, share, and comment on, who you follow, and how long you view content. This data helps the algorithm rank content based on its likelihood to engage you.’

In theory, then, it should be directing you towards content that you actually want to see, based on what you’ve previously interacted with.

If that content is misogynistic or violent, you would presume that liking or watching it would make your algorithm respond accordingly – often upping the extreme ante to keep you hooked and scrolling.
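To make that dynamic concrete, here’s a minimal Python sketch of an engagement-ranked feed. Everything in it is a hypothetical simplification invented for illustration – the Post and UserProfile structures, the ‘extremity’ signal, the hard-coded weights – and real platform recommenders are vastly more complex and, as the experts below note, not public.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    topic: str
    extremity: float  # 0.0 (benign) to 1.0 (extreme); an illustrative signal

@dataclass
class UserProfile:
    # Learned per-topic affinity, updated from likes and watch time.
    topic_affinity: dict[str, float] = field(default_factory=dict)

def predicted_engagement(user: UserProfile, post: Post) -> float:
    # Engagement proxy: topic affinity plus a small bonus for provocative
    # content, which tends to hold attention longer. Note the bonus alone
    # can surface extreme posts to users who never asked for them.
    affinity = user.topic_affinity.get(post.topic, 0.1)
    return affinity + 0.2 * post.extremity

def rank_feed(user: UserProfile, candidates: list[Post]) -> list[Post]:
    # Rank purely by predicted engagement -- the objective has no
    # concept of harm, only of keeping the user scrolling.
    return sorted(candidates,
                  key=lambda p: predicted_engagement(user, p),
                  reverse=True)

def register_interaction(user: UserProfile, post: Post) -> None:
    # A like or a lingering view strengthens the topic affinity,
    # closing the loop: engage once, get served more of the same.
    current = user.topic_affinity.get(post.topic, 0.1)
    user.topic_affinity[post.topic] = min(1.0, current + 0.3)

if __name__ == "__main__":
    user = UserProfile()
    pool = [Post("cooking", 0.0), Post("fights", 0.9), Post("sport", 0.1)]
    register_interaction(user, pool[1])  # one pause on a violent clip...
    print([p.topic for p in rank_feed(user, pool)])
    # -> ['fights', 'sport', 'cooking']: similar content now tops the feed
```

Even in this toy version, two things stand out: the ranking objective rewards whatever holds attention, and a single interaction is enough to tilt every future ranking – the feedback loop Dr Singh describes.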

‘Any group that can be discriminated against can be marginalised further online, as these biases found in data and user behaviour are essentially reinforced by the algorithms,’ Dr Brit Davidson, Associate Professor of Analytics at the Institute for Digital Behaviour and Security at the University of Bath, tells Glamour.

‘This can create self-perpetuating echo chambers, where users are exposed to more content that reinforces and furthers their beliefs.’

My hang-up with this theory is that yes, it addresses how algorithms work, but it fails to address how they’re made – which is surely what’s to blame for men and boys still being pushed worrying content online.

At the root of it is the hasty development of a technology with no apparent research into its potentially threatening implications when left unregulated, as well as the ambiguity of the moderating efforts of social media giants like Meta and TikTok.

‘The issue is that these algorithms and how they work are not made transparent, so it is difficult to tell how content is being pushed (or not),’ continues Davidson.

‘The issue is complicated and difficult to diagnose due to the lack of transparency around how these algorithms work (e.g., how are they explicitly programmed and trained?), alongside a lack of transparency around the user data being used, which is especially important looking at larger corporations that have access to data from multiple platforms.’

Phone addiction – the other, less external and more intractable factor making this obstacle so unmanageable – is something virtually all of us suffer from. It not only exposes us to hate and makes us more vulnerable to it, but also vividly compounds compassion fatigue, a phenomenon that’s blinding us to how desensitised we’ve become to what we constantly consume.

‘These complex dynamics of user and platform behaviour create gaps and inequalities that create social and ethical responsibility that social media platforms need to address,’ finishes Davidson.

‘This includes questions relating to the need for more women and all genders in tech, designing for inclusivity, and how to curate and use data in order to minimise discrimination and bias – none of which are straightforward.’
