Facial recognition software has been controversial since its conception, but news that it will be built into London’s security cameras has sparked widespread concern over its potential for abuse.
If you live in London, the CCTV capital of the world, you are captured by security cameras at least 300 times a day – from the moment you leave your home, on the commute into work, and back again. Anytime you are in a public space, you are being watched.
For some, this might seem immensely creepy – maybe even invasive. For others, it may bring about a sense of security in the event they become the victim of an unprovoked attack while out and about.
This type of public surveillance increased after 9/11, when international terrorism became a top security concern in the West.
As time goes by, investment in national security and anti-terrorism measures continues to increase, despite the fact that 96 percent of deaths motivated by terrorism occur in developing countries marked by long-term political instability and bouts of religious conflict.
The London Metropolitan Police recently approved £3 million plans to expand its surveillance capabilities to include facial recognition – specifically, Retrospective Facial Recognition Technology, which pulls photos from a huge online database (made up of social media posts, old security footage, and other images) to compare against images of people caught on CCTV.
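Under the hood, systems like this typically convert each face into a numerical 'embedding' and then search the database for the stored embedding most similar to the one extracted from CCTV footage. The sketch below is purely illustrative – the tiny four-number 'embeddings', the names, and the 0.8 similarity threshold are all invented for the example, and real systems use far larger vectors produced by neural networks:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_face(probe, database, threshold=0.8):
    """Return the identity whose stored embedding best matches the
    probe image's embedding, or None if nothing clears the threshold."""
    best_id, best_score = None, threshold
    for identity, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id

# Toy embeddings standing in for real face-recognition model output.
database = {
    "person_a": [0.9, 0.1, 0.0, 0.4],
    "person_b": [0.1, 0.8, 0.5, 0.1],
}
probe = [0.88, 0.12, 0.05, 0.41]  # a CCTV still resembling person_a
print(match_face(probe, database))  # prints "person_a"
```

The controversy the article describes follows directly from this design: once embeddings for millions of faces sit in one searchable index, any stored footage can be re-queried against it at any later date.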
Despite a heightened awareness about potential attacks and a greater public concern about safety in public spaces – especially for women – many remain sceptical about incorporating facial recognition technology (FRT) into CCTV.
The key questions are: just because FRT is available, should it be deployed on a wide scale, and how might it be abused?
The argument in favour of facial recognition technology
In the interest of public safety, facial recognition technology has been utilised to monitor criminals’ movements for suspicious activity once they are released from incarceration.
It has also been successful in locating missing persons and children, even years after their disappearance with the help of digital ageing software which can predict what they might look like as adults.
But the strongest case for facial recognition technology lies in identifying suspects who commit crimes in public. This is especially relevant because, over the last 5-10 years, politically motivated violence has increased as polarisation in modern society rises.
Reports show that in the West, 70 violent demonstrations were recorded in 2019 compared to just 19 in 2011. Even in the most surveilled city in the world, some violent offences occurring in the public sphere remain without prosecution – even when video evidence is available.
One example of this is the case of the ‘Putney Pusher’, when in 2017 an unidentified jogger pushed an unsuspecting woman in front of a moving bus.
Footage of the incident was caught on CCTV as well as the bus’s onboard cameras and was featured across virtually every news channel in the UK.
Despite these solid pieces of evidence, the man was never identified, leaving his motivation and identity to become a case for discussion amongst internet sleuths.
Many have suggested that if strong FRT was available at the time, this person could have been caught by police.
Could facial recognition technology become a slippery slope?
For those with concerns over the use of facial recognition, hesitancy doesn’t lie in tracking faces of the public for the practical reasons explored above.
Instead, privacy advocates are concerned that access to a rich database of identities could lead to abuse of power.
A policy advisor at the European Digital Rights advocacy group stated that those charged with monitoring can ‘in effect turn back the clock to see who you are, where you’ve been, what you have done and with whom, over many months or even years.’
She added that ‘the technology [has potential to] suppress people’s free expression, assembly and ability to live without fear.’
It’s a daunting possibility for those who feel they have the right to live their lives freely, with a high degree of personal privacy.
Who is to say that those with access to databases for facial recognition technology can be trusted not to misuse it to spy on people they know or digitally stalk members of the public?
Indeed, the argument for tracking politically motivated criminals and terrorists seems like a strong one, but data suggests that fear surrounding these events is intensified by over-reporting in the media.
Such events are disproportionately documented compared to other, more common causes of death such as health complications, homicide, or road accidents. Acknowledging this, some may believe the occasional benefits of facial recognition technology do not outweigh their right to privacy.
A force to be reckoned with
As cities are adapted to become ‘smarter’ in line with the technologies we use in airports and the phones in our pockets, the widespread use of facial recognition technology in public spaces could become the next thing we come to accept as a society.
We hardly bat an eyelid anymore when we are presented with targeted ads for a product we briefly discussed with our colleagues or housemates.
It’s safe to say we have accepted that data is constantly being collected on our many behaviours – even if we aren’t entirely sure how it works.
But as with any new form of technology, experts should pre-empt the dangers of its widespread use.
At a time when it has never been more evident how authority can lead to abuse of power, the debate over how facial recognition technology is used and regulated will be crucial in the years to come.
Deputy Editor & Content Partnership Manager – London, UK
I’m Jessica (She/Her), the Deputy Editor & Content Partnership Manager at Thred. Originally from the island of Bermuda, I specialise in writing about ocean health and marine conservation, but you can also find me delving into pop culture, health and wellness, plus sustainability in the beauty and fashion industries.