
Sharing ‘deepfake porn’ could soon be illegal in the UK

A government-backed review has warned that current laws do not go far enough to cover ‘disturbing and abusive new behaviours born in the smartphone era.’

During the past decade, numerous steps have been taken to confront the threat of sexual harassment online posed by evolving digital trends.

In 2015, revenge porn – the distribution of sexually explicit images or videos of someone without their consent – was made illegal in the UK.

Five years later, pornography sites began cracking down on non-consensual uploads, introducing blanket bans on downloads and removing videos from unverified uploaders.

More recently, cyberflashing (made possible by mobile phones, social media, dating apps, and wireless image sharing features) became a criminal offence, with perpetrators now facing the same maximum sentence as those convicted of indecent exposure.

All of these moves have been welcomed by activists tirelessly campaigning for better regulation in this sphere.


Concerningly, however, as technology advances, so do the ways in which it can be abused, and when one sinister behaviour is dealt with, another quickly emerges in its place.

One such example is the highly disturbing rise of non-consensual deepfake porn, whereby advances in artificial intelligence allow a person’s face to be superimposed onto sexually explicit images or videos, creating realistic content depicting acts in which they never participated.

Up until this point, little has been done to tackle the issue due to a loophole in British law which excludes images that were not originally private or sexual.

In short, if someone’s non-explicit image is merged with an explicit one and the result is not shared with the intention of causing them direct harm, it isn’t covered by any criminal offence.

Fortunately, this may be about to change because the Law Commission of England and Wales is calling for the law to be reformed to reflect the shifting landscape.

Just last week, the independent body proposed widening the range of motivations covered by these offences to include financial gain, as well as extending automatic anonymity to all victims of intimate image abuse.

The new legal framework, as set out by the commission, would criminalise anyone who intentionally takes or shares intimate images without consent.

Sentences would also be tougher, with up to three years’ imprisonment for the most serious offences, and lifetime anonymity offered to all victims.

‘Sharing intimate images of a person without their consent can be incredibly distressing and harmful for victims, with the experience often scarring them for life,’ Professor Penney Lewis, the law commissioner for criminal law, said in a statement.

‘Current laws on taking or sharing sexual or nude images of someone without their consent are inconsistent, based on a narrow set of motivations and do not go far enough to cover disturbing and abusive new behaviours born in the smartphone era.’


Companies including Twitter, Reddit, and Pornhub have already banned non-consensual deepfake porn. In the US, Virginia and California have outlawed it, while Scotland has criminalised its distribution.

According to Lewis, the phenomenon is dramatically under-reported in the UK because victims are not granted anonymity under current laws.

Her hope, therefore, is that the review will go further to adequately protect victims in the digital age.

‘A change in the law is long overdue and it’s right that under these proposals, all perpetrators of these acts would face prosecution,’ she concludes.

‘Our new reforms for government will broaden the scope of the criminal law to ensure that no perpetrators of these deeply damaging acts can evade prosecution and that victims are given effective protection.’
