With TikTok facing a £27m fine for endangering under-13s and a recent study uncovering that almost half of British children have seen harmful content online, we must address the threat posed by social media platforms and their blatant disregard for the wellbeing of young users.
Earlier this week, news broke that TikTok – a quarter of whose users are aged 10 to 19 – has been accused of processing the data of under-13s without appropriate consent or the necessary legal grounds.
The Information Commissioner’s Office (ICO) issued a provisional notice of intent, signalling a warning before a potential fine of £27m.
According to the ICO, between 2018 and 2020 TikTok failed to give this demographic the required information in a concise, transparent, and easily understandable way.
Prompted by this, the UK watchdog has launched a series of investigations into 50 firms providing digital services, which have reportedly not taken their responsibilities around child safety seriously enough.
‘We all want children to be able to learn and experience the digital world, but with proper data privacy protections,’ it said. ‘Companies providing digital services have a legal duty to put those protections in place, but our provisional view is that TikTok fell short of meeting that requirement.’
Now, while the ICO has yet to draw a final conclusion, this is not the first time TikTok has come under fire for disregarding the wellbeing of its young users.
Though the app has long skirted the attention of regulators (much of which has been reserved for Meta and Google), its growing popularity has brought mounting political scrutiny.
This is particularly true in the UK, where Ofcom has found that 44% of eight to twelve-year-olds use TikTok, despite policies forbidding anyone under 13 from creating an account.
Data breaches aren’t the only threat to children’s safety, however: according to a study published last Thursday, almost half of British children have come across material they felt was harmful, or that made them worried and upset, while scrolling.