The app has announced major changes to its community guidelines in an effort to protect the wellbeing of users who may be vulnerable to videos glorifying unhealthy behaviour.
A little over a year ago, TikTok began investigating videos promoting starvation following reports that potentially harmful pro-weight-loss accounts were still available in search results.
Today, the viral video-sharing platform that's wildly popular amongst young people (60% of its users are between the ages of 16 and 24) has announced major changes to its community guidelines in an effort to further protect the wellbeing of users who may be vulnerable to content that glorifies eating disorders.
This is because, as we know all too well by now, social media can be incredibly triggering for those suffering from a challenging relationship with food, diet, or body image.
To put this into perspective, from April to October 2021, the NHS saw hospital admissions for anorexia, bulimia, and other conditions in teenagers rise by 41%, a disturbing spike that experts believe is linked to the pandemic pushing much of our lives and interactions online.
TikTok is of particular concern, given its ultra-specific algorithm that is tailored to each individual user's in-app behaviour. This feature sets it apart from competitors, but can also quickly create echo-chamber feeds of poor or harmful information regarding health.
Rather than actively diverting its impressionable demographic from videos that are undeniably damaging, TikTok makes it virtually impossible for users not to encounter dangerous themes when trawling through the FYP.
And, unlike other platforms such as Instagram and Tumblr, TikTok doesn't refer users to mental health charities where they can access help, nor does it send resources to people searching pro-ED terms.
Instead, it either pulls up the rulebook for user behaviour, or simply says "no results found."
Having come under fire for this time and time again during the last three years, TikTok will now ban such content outright using a combination of human and artificial intelligence moderation.
Seeking to ensure that nothing slips through the net going forward, and that it can accurately tout itself as a safe space for all users, the platform will be cracking down on videos that promote even mild symptoms of disordered eating.