
Can social media ever tackle its body shaming issue?

Backlash against TikTok’s new ‘chubby filter’ suggests so – but is it enough? 

TikTok’s algorithm may feel chaotic and unpredictable. One minute you’re scrolling past a hotel recommendation, the next a meme about your overbearing boss, then a tour of the M&S aisle. But apps like these have built dedicated algorithms that feed us content they think we want to see.

Everything we share, like, and linger on feeds this system information about our preferences – so the next piece of content we see has been served up especially to keep us hooked. Most of us are aware of this rather insidious truth about social media by now. But it doesn’t raise alarm bells as often as it should, especially when the images that reach viral status have deeply damaging effects.

The most recent TikTok trend to infiltrate hundreds of thousands of feeds is a controversial AI filter that shows users what they might look like if they were to gain weight. The app generates a side-by-side before-and-after to determine a so-called ‘chubby’ transformation, which users have been resharing in droves.

But reactions have been mixed. While some laughed at an ostensibly harmless piece of AI fun, others – particularly women influencers and those in the plus-size community – were concerned.

Since then, TikTok has removed the filter from the app, after pressure from users and warnings from the medical community that it could promote ‘toxic diet culture’ and body dysmorphia.

‘It’s just the same old false stereotypes and tropes about people in larger bodies being lazy and flawed, and something to desperately avoid,’ said Dr Emma Beckett, a food and nutrition scientist.

‘The fear of weight gain contributes to eating disorders and body dissatisfaction, it fuels toxic diet culture, making people obsess over food and exercise in unhealthy ways and opening them up to scam products and fad diets.’

It may seem like a stretch for a silly AI filter to have such detrimental and widespread impacts on our mental health. But the reality is that this has been happening for decades. Long before TikTok existed, we were being fed images of certain body types to promote a distinct and limiting standard of beauty.

This shapes our everyday lives in ways we hardly notice – the clothes we choose to wear, the food we eat, and the way we perceive and treat those whose bodies don’t fit within these limiting ideals. The media we consume ultimately informs the socio-cultural hierarchies we implement within our community.

It’s the reason fat people are persistently ridiculed and excluded from countless conversations and spaces – often through microaggressions like the lack of size inclusivity in popular clothing stores, the absence of larger bodies in mainstream media, or even the size of the seats on public transport.

TikTok’s ‘chubby filter’ isn’t the first – and I doubt it will be the last – example of AI fueling body dysmorphia online. Filters that slim, sculpt, smooth, and ‘beautify’ have been around for years. And the messaging is always the same: thinner = better.

Thanks to the algorithm, once we see people using these filters we’re fed them ten-fold. Suddenly you’re scrolling through videos of people’s diet routines, body transformations, even ‘thinspiration’ content disguised as fitness motivation.

Of course, AI isn’t the root of the problem. It’s just the latest tool being used to reinforce beauty standards that have existed for centuries. But what’s particularly alarming is how seamless AI makes it. These aren’t just filters anymore; they’re full-blown digital manipulations, blurring the line between reality and fiction.

As the ‘chubby filter’ gained traction, many users expressed concern that plus-sized individuals would see the content and feel like the ‘butt of the joke’. One user, Emma, was among those who felt the filter was inherently damaging and needed to be removed.

‘My first thought when [I saw it] was how damaging that would be. People were basically saying they looked disgusting because they were ‘chubby’ and as a curvier woman, who is essentially the ‘after’ photo on this filter, it was disheartening for me,’ Emma told the BBC.

Activist Demi Lynch, who spoke out against the filter on her social media platforms, said it reinforced ‘the notion that being fat is a bad thing and it’s something we should be ashamed of; it’s something we should fear.’

TikTok’s subsequent decision to pull the filter, which was uploaded by CapCut (a platform owned by TikTok’s parent company ByteDance), highlights how social media has the capacity to channel real-world change.

TikTok user Sadie, who has over 60,000 followers on the app, said she was relieved at TikTok’s decision. ‘I’m happy that [they] did that, because ultimately social media should be a fun, lighthearted place, not somewhere where you get bullied for how you look.’

But in the context of the algorithm, which is always growing and changing, this win feels like a small one.

Social media platforms profit from insecurity. The more we compare, the more we engage. The more we engage, the longer we stay on the app. And the longer we stay, the more money they make. It’s a cycle that won’t break unless companies actively prioritize user well-being over profit – which, let’s be honest, isn’t likely.

Sustained pressure is a powerful tool, but we also need regulation and tangible policies that hold platforms accountable on a regular basis.

And on the individual level, we have to keep questioning the content we consume. After all, just because a filter exists, doesn’t mean we have to use it.
