
Grim revelations arise from two-year TikTok investigation

Previously redacted documents about TikTok’s practices have been mulled over by several news outlets. The grim revelations call into question the platform’s duty of care and lax precautions, particularly when it comes to young users.

Two years ago, TikTok discovered some truly disgusting things were taking place on its platform.

Children as young as 14 had been stripping on livestream and taking requests from adults in exchange for gifts or digital currency.

Entrenched in a PR nightmare, the platform began an internal review to reflect on its practices and search for any other glaring failings. The report’s grim details, many of which TikTok kept to itself in 2022, have now been revealed, plunging the platform back into the boiling pot.

Still facing the prospect of being exiled from the US, the vertical video app is now being sued by 14 attorneys general for allegedly using insidious, addictive mechanics specifically designed to get young people hooked.

While apportioning blame to social media for its role in worrying mental health statistics is complicated (legally speaking, anyway), the state authorities assert that TikTok is prioritising growth and profits at all costs, most notably over child safety.

The bipartisan lawsuit alleges that an addiction to the platform can be developed in under 35 minutes of viewing – or 260 videos. Considering public data says the average user spends around 58 minutes per day on TikTok, and 10-19-year-olds dominate the viewing demographics, a jury would probably find any counter-argument flimsy at best.

‘Limits’ features touted as a way for parents to control their children’s screen time have been slated as a mere placation tool. TikTok admitted the limits had little impact, accounting for a drop of around a minute and a half in overall usage – from 108.5 minutes per day to 107 minutes.

Obviously TikTok officials won’t have been thrilled at the prospect of deliberately culling their own engagement, but a dismissive comment within one document comes across as brash and insulting.

A project manager reportedly stated that the real goal behind ‘limits’ is ‘improving public trust in the TikTok platform via media coverage.’ In other words, let’s pay lip service to a legitimate criticism and forgo any sense of moral duty to young users. ‘Our goal is not to reduce the time spent,’ it read.

One mental health consideration that has undeniably become synonymous with social media is body image. It doesn’t make for good reading, then, that the algorithm allegedly prioritises content from ‘beautiful people.’

When TikTok’s main feed hosted a high volume of creators not considered ‘beautiful’ by overtly conventional standards, the company tweaked its algorithm to amplify content from those it did.

TikTok ‘took active steps to promote a narrow beauty norm even though it could negatively impact their Young Users,’ the Kentucky authorities wrote in the previously redacted documents. If true, the only thing ugly is TikTok’s attitude to body image and mental health, particularly where young people are concerned.

It goes without saying that TikTok were never, and will never be, bastions of child welfare; their infinite short-form buffet relies on dwindling attention spans to remain relevant. What is shocking, nonetheless, is the complete disregard for the subject evident in these reports.

One unnamed executive summed it up in stark terms, allegedly declaring that the reason children use TikTok is because the app’s algorithm keeps them from ‘sleep, eating, and moving around the room, and looking at someone in the eyes.’

Despite mounting evidence vindicating what many of us had already gathered about TikTok, the platform remains on the defensive. Its spokesperson, Alex Haurek, recently described the emerging documents as ‘outdated’ and damning quotes as ‘cherry picked’ and ‘misleading.’

Either way, regular folk – and more importantly, the courts – will decide what the revelations from the report mean. Given that the once-redacted passages are the source of most of the outrage now that they have been seen, I think one outcome is currently far more likely than the other.
