
Are tech companies failing to protect children?

With TikTok facing a potential £27m fine for endangering under-13s, and a recent study finding that almost half of British children have seen harmful content online, we must address the threat posed by social media platforms and their blatant disregard for the wellbeing of young users.

Earlier this week, news broke that TikTok – 25% of whose users are aged 10 to 19 – has been accused of processing the data of under-13s without appropriate consent or the necessary legal grounds.

The Information Commissioner’s Office (ICO) issued a provisional notice of intent, signalling a warning before a potential fine of £27m.

As stated by the ICO, TikTok failed to give the required information to this demographic in a concise, transparent, and easily understandable way between 2018 and 2020.

For this reason, the UK watchdog has embarked upon a series of investigations into 50 different firms providing digital services, which have reportedly not taken their responsibilities around child safety seriously enough.

‘We all want children to be able to learn and experience the digital world, but with proper data privacy protections,’ it said. ‘Companies providing digital services have a legal duty to put those protections in place, but our provisional view is that TikTok fell short of meeting that requirement.’


Now, while the ICO has yet to draw a final conclusion, this is not the first time TikTok has come under fire for disregarding the wellbeing of its young users.

Though the app has largely escaped the attention of regulators for some time now (with much of the focus reserved for Meta and Google), as it grows in popularity, so too does the political scrutiny it faces.

This is particularly true in the UK, where Ofcom has found that 44% of eight to twelve-year-olds use TikTok, despite its policies forbidding anyone under 13 from creating an account.

It isn’t just the misuse of their data that poses a threat to children’s safety, however. According to a study published last Thursday, almost half of British children have come across material while scrolling that they felt was harmful, or that made them worried and upset.

This includes pornography, sexualised and violent imagery, anonymous trolling, and posts promoting diet restriction.


The research was carried out following the death of 14-year-old Molly Russell, who took her own life in November 2017 after viewing large amounts of online content related to suicide, depression, anxiety, and self-harm.

This is a tragedy that Dame Rachel de Souza, the children’s commissioner for England, has said she fears could be repeated unless drastic action is taken immediately to regulate tech companies more effectively.

‘This content shouldn’t be available to children of this age and the tech companies should be making sure it’s taken down,’ says de Souza.

‘Girls as young as nine told my team about strategies they employ when strangers ask for their home address online. In a room of 15- and 16-year-olds, three-quarters had been sent a video of a beheading.’

As she explains, children rarely seek out this content. Instead, it is promoted and offered up to them by highly complex recommendation algorithms, which are designed to capture and retain their attention.


To make matters worse, platforms do so little in response to children’s reports of such content that most children have stopped reporting it altogether, saying ‘there’s no point.’

The survey also looked into parents’ anxieties about this issue, with 67% of participants saying they were deeply concerned about the impact of harmful content on young minds.

On this note, de Souza is determined to ensure that – going forward – the online safety bill and the ICO’s Children’s Code work harder to guarantee that the tech giants profiting from young people put that money towards protecting them from the immeasurable dangers the Internet presents.

‘Self-regulation by tech companies has failed; the experiences of so many children are testament to that,’ she concludes.

‘Yet we have an enormous opportunity to right these wrongs through the online safety bill and careful regulation by Ofcom. It is vital that the bill keeps children’s safety at its heart.’
