
UK grooming cases reach record highs amid online safety law delay

Amid the ongoing clash between UK ministers and tech firms over end-to-end encryption, recorded instances of online child grooming have reached record highs, figures published in 2023 show.

The ongoing impasse over end-to-end encryption is having severe consequences in the UK.

To meet users’ demand for privacy, a growing number of tech firms offer end-to-end encrypted messaging, meaning only the sender and the recipient can read the content of an exchange. Not even the tech firms themselves can pull up the messages.
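
To illustrate the principle, the sketch below shows why a provider relaying end-to-end encrypted messages holds only unreadable ciphertext. It uses the PyNaCl library purely as an illustrative assumption; it is not how any particular messaging app implements its encryption.

```python
# Minimal sketch of end-to-end encryption (illustrative only, using PyNaCl).
from nacl.public import PrivateKey, Box

# Each party generates a key pair; private keys never leave their devices.
sender_private = PrivateKey.generate()
recipient_private = PrivateKey.generate()

# The sender encrypts using their private key and the recipient's public key.
sender_box = Box(sender_private, recipient_private.public_key)
ciphertext = sender_box.encrypt(b"hello")

# A service provider relaying or storing this message only ever sees the
# ciphertext; without a private key it cannot recover the plaintext.

# Only the recipient, holding the matching private key, can decrypt it.
recipient_box = Box(recipient_private, sender_private.public_key)
assert recipient_box.decrypt(ciphertext) == b"hello"
```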

While most of us value privacy above almost everything else, it carries a grave element of risk that cannot be ignored.

Children’s charity the NSPCC has revealed that 34,000 crimes related to online grooming have been recorded since calls for better online safety laws first came about in 2017.

Citing data from 42 UK police forces, the organisation says that 6,350 such offences involving a child victim were recorded in 2022 alone, which represents an all-time high.

New research shows that over the six-year period, a quarter of the roughly 21,000 known victims were primary-school-age children (under 12), and that 73% were targeted on Snapchat or Meta platforms.

While tech companies and lawmakers agree that something needs to change, an apparent trade-off between privacy and child safety has held up any meaningful progress.

The latest draft of the Online Safety Bill would require a backdoor into social media services, accessible exclusively by the authorities when needed.

Tech companies, meanwhile, are concerned that loosening any protections may provide a window of opportunity for hackers and data scrapers to wreak havoc with our sensitive information.

Social media platforms, by and large, prefer the alternative of developing their own safety precautions, rolling out updates to curb the spread of child sexual abuse material (CSAM) and to prevent children from encountering other harmful or age-restricted content.

‘We’ve developed over 30 features to support teens and their families, including parental supervision tools that let parents be more involved in how their teens use Instagram,’ explained an Instagram spokesperson.

Despite the individual efforts of companies, however, the data shows that the epidemic of online child grooming is growing worse – exacerbated by the smokescreen social media inadvertently provides.

The NSPCC’s chief executive, Sir Peter Wanless, said: ‘Today’s research highlights the sheer scale of child abuse happening on social media and the human cost of fundamentally unsafe products.’

‘The number of offences must serve as a reminder of why the Online Safety Bill is so important and why the ground-breaking protections it will give children are desperately needed.’

The standoff between Silicon Valley giants and government regulators persists, but a reported Ofcom intervention could see industry-wide changes imposed in the coming months.

It’ll be interesting to see whether the right balance can be struck to satisfy both sides. One thing is for sure, though: an all-hands-on-deck approach is unequivocally needed.
