
Co-Op faces legal issues over facial recognition technology

Big Brother Watch is accusing shopping chain Co-Op of misusing facial recognition technology in an ‘Orwellian’ policy that matches shoppers against a database without their consent.

Are you a frequent Co-Op shopper? It’s likely you’ve been entered into a database via facial recognition technology.

Big Brother Watch has sent an official complaint to the Information Commissioner’s Office concerning biometric surveillance technology currently in use at 35 Co-Op stores. A camera takes photos of incoming customers’ faces, which are analysed and converted into biometric data.

This data is then compared against a database of people who have stolen from shops or been violent – at least according to the Co-Op.

News that the shopping chain is using technology this advanced to identify everyone who walks through its doors might sound a bit alarming.

A spokeswoman said that while the Co-Op has a ‘watch-list’ of problematic patrons, it does not have a comprehensive record of people with criminal convictions. Instead, it’s simply a reference to anyone who’s been aggressive or broken conduct rules inside Co-Op shops.

Big Brother Watch isn’t having any of that, mind. It says there isn’t solid enough legal ground for shops to be using such invasive cameras, describing the approach as ‘Orwellian in the extreme’.

It says that the supermarket was ‘adding customers to secret watch-lists with no due process’, noting that ‘shoppers can be spied on, blacklisted across multiple stores and denied food shopping despite being entirely innocent’.

‘This is a deeply unethical and frankly chilling way for any business to behave’.

All that may be true, but why is it legally dubious?

While the biometric data collected from the cameras is wiped after being analysed, pictures are stored for 72 hours. Big Brother Watch says this level of high-tech recognition is disproportionate to the severity of the offences involved, making it unnecessarily intrusive.

This is coupled with a lack of consent or knowledge from customers. Ask any regular supermarket shopper whether they’re aware of being analysed and entered into databases, and most – if not all – will say no.

Co-Op says it has informative signs in the relevant stores, but according to Big Brother Watch, not enough is being done to make the public aware of their rights and how their data is handled. Keep in mind that this facial recognition technology is also in use at Costcutter, Sports Direct, Spar, Nisa, and Frasers.

In a statement, the supermarket chain said that it takes its ‘responsibilities around the use of facial recognition extremely seriously’. It also explained that it ‘works hard to balance our customer rights with the need to protect our colleagues’.

Big Brother Watch argues that this technology ‘does not bring serious criminals to justice’. It also pointed out that analytical face databases ‘empower individual businesses to keep undesirable [people] out of their stores and move them somewhere else’.

This is a fair point, considering we don’t fully know how companies are interpreting data. Are there racial, societal, or economic biases at play, considering all three are significant worries and stresses for current AI implementation in other industries?

The information is also private and isn’t shared with legal or government bodies, meaning Co-Op is under no obligation to openly demonstrate how or where its data goes. It also doesn’t serve the wider public – as pointed out in the complaint – and could easily be deemed overkill, all things considered.

We’ll have to see where this challenge goes. Its outcome may have wider implications for the use of such technology, and could discourage other big chains and brands from hopping on the bandwagon.

It’s likely most of the public would rather not be constantly analysed, at the very least, right?