
TikTok is failing to protect young users

Sparking fresh concerns about the platform’s influence on children, new research has uncovered that its algorithm is pushing videos about eating disorders and self-harm to 13-year-olds.

According to research from the Centre for Countering Digital Hate (CCDH), TikTok appears to be pushing videos about eating disorders and self-harm to children as young as 13.

The study, which examined the platform’s powerful recommendation algorithm, found that new accounts were served content promoting suicide within 2.6 minutes of joining, and explicit pro-thinness clips within eight.

Researchers also discovered that the For You Pages of accounts they set up with characteristics of vulnerable teenagers were served this kind of content twelve times more often than standard accounts.

This has sparked fresh concerns about the app’s influence on impressionable users.

It comes as state and federal lawmakers in the US seek ways to crack down on TikTok over privacy and security fears, and to determine whether or not the platform is appropriate for teens.

It also follows a series of investigations launched in September into TikTok’s handling of the data of 10 to 19-year-olds (a quarter of its userbase), which it had allegedly been processing without their consent or the necessary legal grounds.

In the wake of those investigations, the company vowed to change, but the latest findings from the CCDH suggest otherwise.

‘The results are every parent’s nightmare: young people’s feeds are bombarded with harmful, harrowing content that can have a significant cumulative impact on their understanding of the world around them, and their physical and mental health,’ wrote Imran Ahmed, CEO of CCDH, in the report titled Deadly by Design.

‘This underscores the urgent need for reform of online spaces. Without oversight, TikTok’s opaque platform will continue to profit by serving its users – children as young as 13, remember – increasingly intense and distressing content without checks, resources or support.’

In response, TikTok has disputed the claims, stressing that the study’s methodology did not ‘reflect the genuine behaviour or viewing experiences of real people.’

As emphasised by one of its spokespeople, the app regularly consults with health experts, removes content that violates its policies, and provides access to supportive resources for anyone in need of them.

‘We’re mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others on these important topics,’ they said.

TikTok added that it continues to roll out new safeguards for its users, including parental controls and ways to filter out mature or ‘potentially problematic’ videos exploring ‘complex themes.’

Regardless, Ahmed believes that government officials should take the wheel on this issue.

‘This is a classic moment for regulators to step in and say we need to have some sort of non-proliferation agreement, that we won’t just create algorithms that are more and more addictive and more and more lethal,’ he says.

‘But we don’t have that, we have nothing, we have no guidance from the government or from regulators, from the FTC, from anywhere else.’
